Abstract
Dixon and colleagues (2015) asserted that faculty research productivity is a key indicator of the quality of university programs that train future practitioners of behavior analysis. Based on their analysis of publications in select journals, the authors concluded that many faculty in such programs have published little to no research. Some alternative measures of both faculty research productivity and the quality of practitioner training programs are suggested here.
Keywords: Practitioners, Training, Graduate programs, Research
Evaluating the Quality of Behavior Analytic Practitioner Training Programs
Dixon et al. (2015) raised important questions about graduate training in applied behavior analysis. Their rankings of graduate programs in terms of faculty research productivity rested on several uncontroversial assumptions: Standards for evaluating training programs are desirable; indices of program quality can be helpful; our discipline needs skilled researchers; and practitioners-in-training benefit from exposure to research. Assumptions that warrant examination include those underlying the authors’ finding of a discrepancy in the research productivity of faculty in two sets of programs and the implication that programs in which faculty research productivity appears low are inferior. Those assumptions and alternative measures of faculty research productivity and program quality are considered here.
Alternative Measures of Research Productivity
An evaluation of training programs must begin by selecting programs for inclusion. Dixon and colleagues sought to compare faculty in programs accredited by the Association for Behavior Analysis International (ABAI) with faculty in what they characterized as “BACB [Behavior Analyst Certification Board] training programs” (pp. 7 and 15). Unfortunately, that is like comparing apples with oranges, because the BACB does not accredit, approve, or endorse programs. It only pre-approves courses and practica that fulfill its coursework and experiential training eligibility requirements. Thus, the two sets of programs compared by Dixon et al. were not well matched, which likely affected the findings. For instance, the ABAI set probably included a larger proportion of doctoral programs, which may have more supports in place to foster research than many master’s-degree programs. Future evaluations might include only training programs that are accredited by ABAI and have BACB-approved course sequences and practica, and they might analyze doctoral and master’s-degree programs separately. It might be best to conduct such evaluations when there is a larger pool of programs that meet the foregoing criteria than there is today.
Faculty research productivity could be measured in several ways. Dixon and colleagues examined publications in two journals published by the Society for the Experimental Analysis of Behavior and four published by ABAI. One of the latter—The Behavior Analyst—does not publish original research; it publishes literature reviews, reinterpretations of published data, discussion articles, and articles on philosophy. Since Dixon et al. counted articles in those latter categories, they actually measured publishing productivity, not research productivity. Their inclusion criteria may have favored programs that prepare students for careers in academia and research. A more accurate picture of the research productivity of practice-oriented faculty could be obtained by counting only research publications and searching additional journals in behavior analysis, education, human development, developmental disabilities, psychology, rehabilitation, and organizational behavior management.
Alternative Measures of Program Quality
Ideally, measures of program quality should include data from experimental analyses of relations among faculty, practitioner, and client repertoires. It would be interesting to see how many of those relations have been examined in experimental—or even correlational—studies and what those studies have shown. Dixon and colleagues cited a correlation between faculty publications and employer satisfaction with graduates of psychology programs (p. 9). Employer ratings of graduates of behavior analytic programs could be valuable, but direct, objective measures of graduates’ repertoires might better reflect program quality. One available measure is BACB certification exam pass rates. Future evaluations might examine relations between those data and various measures of faculty performance.
The question of whether practitioners must be trained to conduct research has been debated extensively (Critchfield 2015; Kelley et al. 2015; Moore and Shook 2001). There appear to be many opinions on that question but little research. Dixon et al. echoed an opinion expressed by some behavior analysts (e.g., Baer 1992; Reid 1992), but others have argued that practitioners-to-be should be trained to apply behavioral technologies and evaluate their effects on clients, not to conduct research (e.g., Johnston 1996; Malott 1992; Shook et al. 2002). Still others have suggested that practitioners need to be critical consumers of research (e.g., Green 2010).
Instead of relying on a few opinions, many professions use job analysis studies to identify essential competencies and the content of credentialing examinations, exam eligibility requirements, training curricula, and credentialed professionals’ scope of work. There are well-established legal and psychometric standards for conducting such studies. They typically entail having panels of experts develop a list of knowledge and skills required of a competent practitioner. That task list becomes the contents of a survey on which a large pool of professionals rate the importance of each task. The resulting data are used to determine which tasks remain on the list. The final task list is used to develop exam items and often to set formal and experiential training requirements. Several job analyses of the practice of behavior analysis have been conducted by the BACB and the Florida Department of Children and Families. They involved thousands of behavior analysts—basic and applied researchers and academics as well as practitioners (Behavior Analyst Certification Board 2011; Shook et al. 1995, 2004).
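For readers who want a concrete picture of how such survey data might be used, the sketch below shows one simple way importance ratings could be aggregated to decide which candidate tasks remain on a task list. It is illustrative only: the rating scale, retention threshold, task names, and data are hypothetical and are not drawn from the BACB's or Florida's actual procedures or criteria.

```python
# Illustrative sketch only: aggregating hypothetical importance ratings from a
# job-analysis survey to decide which candidate tasks are retained.
# The 1-4 scale, threshold, and data are invented for illustration and do not
# reflect any credentialing body's actual method.

from statistics import mean

# Each key is a candidate task; each value is a list of importance ratings
# (1 = not important ... 4 = critical) from surveyed professionals.
survey_ratings = {
    "Example task A": [4, 4, 3, 4, 4],
    "Example task B": [3, 4, 3, 3, 4],
    "Example task C": [2, 1, 2, 2, 1],
}

RETENTION_THRESHOLD = 2.5  # hypothetical cutoff for mean importance


def retained_tasks(ratings, threshold=RETENTION_THRESHOLD):
    """Return tasks whose mean importance rating meets the threshold."""
    return {
        task: round(mean(scores), 2)
        for task, scores in ratings.items()
        if mean(scores) >= threshold
    }


if __name__ == "__main__":
    for task, score in retained_tasks(survey_ratings).items():
        print(f"Retain: {task} (mean importance = {score})")
```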
The task list resulting from the most recent BACB job analysis includes behavior analytic measurement and research designs, so those topics must be covered in BACB-approved courses (Behavior Analyst Certification Board 2015). Although research has not yet determined whether proficiency in publishing research is necessary or sufficient for training others in research methods, performing a set of tasks and training others to perform them clearly involve distinct repertoires. Therefore, if faculty research productivity is used to index training program quality, it should be accompanied by some evidence of proficiency in implementing effective procedures for training graduate students in research skills.
The aforementioned job analysis studies have identified many additional competencies for behavior analytic practitioners. Abundant research suggests that to develop those competencies, faculty must be skilled in describing, demonstrating, observing, and providing feedback on trainee performance of a very large array of procedures for assessing behavior-environment interactions, producing generalized behavior change, and arranging systems to support effective service delivery, among others. It follows that objective measures of those repertoires should be weighted as much as or more than faculty research productivity in evaluating the quality of behavior analytic practitioner training programs.
Author Notes
Authors are listed alphabetically; all contributed equally. The Association of Professional Behavior Analysts (APBA) has not reviewed or approved the content of this article, does not endorse or sponsor it, and is not otherwise affiliated with it. The content represents the opinions of the authors, not the opinion or position of the APBA.
William H. Ahearn, PhD, BCBA-D, LABA is Director of Research at the New England Center for Children and has conducted basic, translational, and applied research that has been published in a wide variety of outlets. He is on the faculty of Western New England University’s doctoral and master’s programs and is an adjunct for the University of Massachusetts Medical School/E.K. Shriver Center. Bill is past president of the Association of Professional Behavior Analysts and the Berkshire Association for Behavior Analysis and Therapy. He currently serves as the Chair of the Board of Registration of Allied Mental Health and Human Services Professions in Massachusetts.
Gina Green, PhD, BCBA-D has conducted basic and applied research in behavior analysis, designed and overseen comprehensive and focused ABA interventions, trained behavior analytic researchers and practitioners, and worked on public policies affecting the practice of ABA. She has been president of the Association for Behavior Analysis International and the California Association for Behavior Analysis and has served on the ABAI Accreditation Board and the Board of Directors of the Behavior Analyst Certification Board. Gina is currently the Executive Director of the Association of Professional Behavior Analysts.
Mary M. Riordan, PhD, BCBA-D is the Developmental Disabilities Director for Behavior Management Consultants, Inc. Her work involves providing direct services to individuals, consultation with human service organizations, development of policy related to the practice of applied behavior analysis, and the hiring and supervision of other behavior analysts. Currently, Mary is the president of the Association of Professional Behavior Analysts as well as the Public Policy Chair and past president of the Florida Association for Behavior Analysis.
Nicholas L. Weatherly, PhD, BCBA-D is a consultant with Aubrey Daniels International, specializing in fluency training, e-learning, organizational assessment, and coaching systems. He was the inaugural chair of the Kentucky Applied Behavior Analyst Licensing Board, is the current president of the Georgia Association for Behavior Analysis, and is president-elect of the Association of Professional Behavior Analysts.
References
- Baer DM. Teacher proposes, student disposes. Journal of Applied Behavior Analysis. 1992;25:89–92. doi: 10.1901/jaba.1992.25-89.
- Behavior Analyst Certification Board (2011). BACB Online Newsletter. Resource document. Behavior Analyst Certification Board, Inc. http://www.bacb.com/newsletter/BACB_Newsletter_05_2011.pdf. Accessed June 16, 2015.
- Behavior Analyst Certification Board (2015). Fourth Edition Task List. Resource document. Behavior Analyst Certification Board, Inc. http://www.bacb.com/Downloadfiles/TaskList/BACB_Fourth_Edition_Task_List.pdf. Accessed June 16, 2015.
- Critchfield TS. What counts as high quality practitioner training in applied behavior analysis? Behavior Analysis in Practice. 2015;8:3–6. doi: 10.1007/s40617-015-0049-0.
- Dixon MR, Reed DD, Smith T, Belisle J, Jackson RE. Research rankings of behavior analytic graduate training programs and their faculty. Behavior Analysis in Practice. 2015;8:7–15. doi: 10.1007/s40617-015-0057-0.
- Green G. Training practitioners to evaluate evidence about interventions. European Journal of Behavior Analysis. 2010;10:223–228.
- Johnston JM. Distinguishing between applied research and practice. The Behavior Analyst. 1996;19:35–47. doi: 10.1007/BF03392737.
- Kelley DP III, Wilder DA, Carr JE, Rey C, Green N, Lipschultz J. Research productivity among practitioners in behavior analysis: recommendations from the prolific. Behavior Analysis in Practice. 2015. doi: 10.1007/s40617-015-0064-1.
- Malott RW. Should we train applied behavior analysts to be researchers? Journal of Applied Behavior Analysis. 1992;25:83–88. doi: 10.1901/jaba.1992.25-83.
- Moore J, Shook GL. Certification, accreditation, and quality control in behavior analysis. The Behavior Analyst. 2001;24:45–55. doi: 10.1007/BF03392018.
- Reid DH. The need to train more behavior analysts to be better applied researchers. Journal of Applied Behavior Analysis. 1992;25:97–99. doi: 10.1901/jaba.1992.25-97.
- Shook GL, Hartsfield F, Hemingway M. Essential content for training behavior analysis practitioners. The Behavior Analyst. 1995;18:83–91. doi: 10.1007/BF03392694.
- Shook GL, Ala’i-Rosales S, Glenn SS. Training and certifying behavior analysts. Behavior Modification. 2002;26:27–48. doi: 10.1177/0145445502026001003.
- Shook GL, Johnston JM, Mellichamp F. Determining essential content for applied behavior analyst practitioners. The Behavior Analyst. 2004;27:67–94. doi: 10.1007/BF03392093.