Behavior Analysis in Practice. 2015 Aug 4;8(2):156–158. doi: 10.1007/s40617-015-0075-y

An Alternative Measure of Research Productivity Among Behavior Analytic Graduate Training Programs: a Response to Dixon et al. (2015)

David A Wilder 1, Joshua L Lipschultz 1, David P Kelley III 1, Catalina Rey 1, Amy Enderli 1

Dixon et al. (2015) examined the research productivity of 74 graduate training programs that are accredited by the Association for Behavior Analysis International (ABAI), have a course sequence approved by the Behavior Analyst Certification Board™ (BACB), or both. These authors provided a list of the top 10 research-producing programs and individuals for each of six journals in the field of behavior analysis. Although informative, Dixon et al.’s analysis provides only one measure of research productivity. Research productivity can, and should, be measured in many different ways.

Dixon et al. analyzed the total number of individual and program publications through 2013. This measure favors long-established programs; newer programs have not had the opportunity to accumulate the publication totals that many long-established programs have. In addition, Dixon et al. examined journals in the field of behavior analysis, but some of these journals do not publish applied behavior analytic content. To the extent that research productivity is an important measure of the quality of practitioner training programs (of course, this point is controversial and will be left to other commentaries), it is reasonable to argue that graduate training programs in applied behavior analysis (ABA) should be evaluated based on their research productivity in applied behavior analysis. To this end, we examined the most research-productive programs among those evaluated by Dixon et al. during the 15-year period from 2000 to 2014. To emphasize the importance of research, our examination focused exclusively on empirical studies; discussion articles, reviews, and conceptual papers were excluded. In addition, we examined six applied behavior analysis journals, as our interest was in the applied domain.

We reviewed all issues of Behavior Analysis in Practice (BAP), Behavioral Interventions (BIN), the Journal of Applied Behavior Analysis (JABA), the Journal of Behavioral Education (JBED), the Journal of Organizational Behavior Management (JOBM), and The Analysis of Verbal Behavior (TAVB) published between 2000 and 2014. We obtained affiliation information from articles that met the following criteria: (a) the article included at least one research participant, and (b) the article contained a method section, a results section, and a discussion section (although the results and discussion sections may have been combined). Although other journals regularly publish applied behavior analytic content (e.g., Behavior Modification, Education and Treatment of Children, Journal of Positive Behavior Interventions, Research in Autism Spectrum Disorders, Research in Developmental Disabilities), these journals also publish non-behavior analytic studies and were therefore excluded to ensure that only behavior analytic articles were counted in our review.
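To make the screening rule explicit, the minimal sketch below expresses criteria (a) and (b) as a single predicate. The field names are hypothetical placeholders for however an article record might be coded; they are not the actual coding sheet used in this review.

```python
def meets_inclusion_criteria(article):
    """Return True when an article satisfies both screening criteria:
    (a) at least one research participant, and (b) a method section plus
    results and discussion sections (which may be combined).
    The dictionary keys are hypothetical placeholders."""
    has_participant = article["num_participants"] >= 1
    has_method = article["has_method_section"]
    has_results_and_discussion = (
        (article["has_results_section"] and article["has_discussion_section"])
        or article["has_combined_results_discussion"]
    )
    return has_participant and has_method and has_results_and_discussion
```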

We counted affiliations by giving each institution (program) one publication credit per article (Shabani et al. 2004). For each article, credit was given to the programs of the first six authors, regardless of authorship order (no more than six because the American Psychological Association recognizes only the first six authors in its citation system). A program did not receive more than one credit for an article, even if more than one of its affiliated authors appeared on that article. We then summed publication credits across articles to identify the top 10 programs. A second observer independently scored at least 25 % of the volumes from each journal; point-by-point intercoder agreement was 98.8 % (range, 96.9 to 100 %).
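As a concrete illustration of this counting rule, the brief sketch below tallies one credit per program per article using only the first six authors' affiliations. The input format (a list of records with an "affiliations" field in authorship order) is a hypothetical one chosen for clarity, not the data set used here.

```python
from collections import Counter

def count_program_credits(articles):
    """Tally one publication credit per program per article: only the first
    six authors' affiliations are considered, and a program is credited at
    most once per article even if several of its authors appear on the byline.
    `articles` is assumed to be a list of dicts with an "affiliations" key
    listing program affiliations in authorship order (hypothetical format)."""
    credits = Counter()
    for article in articles:
        # Keep the first six authors, then collapse duplicates so a program
        # earns a single credit for this article.
        programs = set(article["affiliations"][:6])
        for program in programs:
            credits[program] += 1
    return credits

# Example: an article with two University of Florida authors still yields a
# single credit for that program.
example = [{"affiliations": ["University of Florida",
                             "University of Florida",
                             "University of Kansas"]}]
print(count_program_credits(example))
# Counter({'University of Florida': 1, 'University of Kansas': 1})
```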

To obtain a measure of research impact, we identified the year of publication and the number of citations (obtained from Google Scholar™) for each article published by each of the top 10 training programs in the six target journals between 2000 and 2014. We subtracted the year of publication from 2015 to obtain the years since publication and divided each article’s citation count by its years since publication to obtain a rate of citations per year. Finally, we calculated the mean rate of citations per year across the articles published by each program. A second observer independently scored 30 % of programs; intercoder agreement was 100 %.
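The citation-rate computation described above amounts to dividing each article's citation count by 2015 minus its publication year and then averaging those per-article rates within a program. A minimal sketch follows, assuming a hypothetical input of (publication year, citation count) pairs for a single program.

```python
def mean_citation_rate(articles, reference_year=2015):
    """Compute the mean citations-per-year rate for one program's articles:
    each article's citation count is divided by the years since its
    publication, and the per-article rates are then averaged.
    `articles` is a list of (publication_year, citation_count) tuples
    (a hypothetical input format for illustration)."""
    rates = []
    for year, citations in articles:
        years_since_publication = reference_year - year
        rates.append(citations / years_since_publication)
    return sum(rates) / len(rates)

# Example: an article from 2005 with 40 citations (rate 4.0) and one from
# 2012 with 9 citations (rate 3.0) yield a mean rate of 3.5.
print(mean_citation_rate([(2005, 40), (2012, 9)]))  # 3.5
```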

The most productive programs of the 74 analyzed by Dixon et al. (2015) are listed in Table 1. The mean rate of citations per year for articles published in our target journals during our target time frame is listed for each of the top 10 programs in Table 2. For reference, the top 10 programs in terms of total publications as listed by Dixon et al. are provided in Table 3. As can be seen in the tables, the findings of the current analysis differ from those of Dixon et al.

Table 1.

The 10 most research-productive (in applied behavior analysis) behavior analytic graduate training programs (2000–2014)

Rank Institution Publications
1 University of Florida 115
2 University of Kansas 95
3 Western Michigan University 90
4 Southern Illinois University 65
5 Florida Institute of Technology 62
6 University of Nevada, Reno 58
7 Queens College, City University of New York 50
8 Western New England University 38
9 Ohio State University 37
10 University of the Pacific 37

Table 2.

Mean citation rate per year for articles published (in applied behavior analysis) by the 10 most research-productive behavior analytic graduate training programs (2000–2014)

Rank Institution Mean citation rate per year
1 University of Florida 5.67
2 Queens College, City University of New York 3.89
3 Ohio State University 3.30
4 University of Nevada, Reno 2.84
5 Southern Illinois University 2.62
6 Western Michigan University 2.48
7 University of Kansas 2.37
8 University of the Pacific 2.15
9 Western New England University 1.96
10 Florida Institute of Technology 1.56

Table 3.

The top 10 programs by total publications according to Dixon et al. (2015)

Rank Institution
1 University of Maryland, Baltimore County
2 University of Florida
3 Western New England University
4 University of Kansas
5 Western Michigan University
6 West Virginia University
7 Southern Illinois University
8 University of Wisconsin, Milwaukee
9 University of Nevada, Reno
10 Florida Institute of Technology

Note. The number of publications was not provided.

Like the analysis reported by Dixon et al. (2015), this analysis has a number of limitations. First, because only empirical studies were counted, the results may underestimate the productivity of some of the programs listed. Second, we included only six applied journals in our analysis. Many program faculty also publish in other applied outlets that were not included. Future research might also include studies published in non-behavior analytic journals. Publishing in non-behavior analytic outlets may be particularly beneficial for the field, as it helps disseminate the utility of behavior analysis to other disciplines. Finally, because some researchers may have moved to another program between 2000 and 2014, some programs may have received credit for authors who no longer work there. Due to these limitations, the current report, along with the results of Dixon et al. (2015), should be seen as just one way of analyzing the research productivity of applied behavior analytic training programs. Other methods of analysis will undoubtedly yield other outcomes.

Although we agree with Dixon et al. (2015) that measuring the research productivity of graduate training programs in ABA is useful, particularly for prospective graduate students and other consumers (e.g., employers), our data suggest that productivity can be measured in many ways. As a field, we first need to establish which data are most meaningful to consumers and then routinely collect and report these data in our professional journals.

Footnotes

Author Note

David A. Wilder is a Professor in the School of Behavior Analysis and the Chair of the on-campus program in behavior analysis at the Florida Institute of Technology in Melbourne, FL. He teaches, supervises student practica and research, and consults with agencies providing behavior analytic services. Joshua L. Lipschultz, David P. Kelley III, Catalina Rey, and Amy Enderli are graduate students in behavior analysis at the Florida Institute of Technology.

References

  1. Dixon MR, Reed D, Smith T, Belisle J, Jackson RE. Research ranking of behavior analytic graduate training programs and their faculty. Behavior Analysis in Practice. 2015;8:7–15. doi: 10.1007/s40617-015-0057-0.
  2. Shabani DB, Carr JE, Petursdottir AI, Esch BE, Gillett JN. Scholarly productivity in behavior analysis: the most prolific authors and institutions from 1992 to 2001. The Behavior Analyst Today. 2004;5:235–243. doi: 10.1037/h0100035.
