Journal of Applied Behavior Analysis. 2003 Fall;36(3):387–406. doi: 10.1901/jaba.2003.36-387

Visual aids and structured criteria for improving visual inspection and interpretation of single-case designs.

Wayne W Fisher 1, Michael E Kelley 1, Joanna E Lomas 1
PMCID: PMC1284456; PMID: 14596583

Abstract

Because behavior analysis is a data-driven process, a critical skill for behavior analysts is accurate visual inspection and interpretation of single-case data. Study 1 was a basic study in which we increased the accuracy of visual inspection methods for A-B designs through two refinements of the split-middle (SM) method called the dual-criteria (DC) and conservative dual-criteria (CDC) methods. The accuracy of these visual inspection methods was compared with one another and with two statistical methods (Allison & Gorman, 1993; Gottman, 1981) using a computer-simulated Monte Carlo study. Results indicated that the DC and CDC methods controlled Type I error rates much better than the SM method and had considerably higher power (to detect real treatment effects) than the two statistical methods. In Study 2, brief verbal and written instructions with modeling were used to train 5 staff members to use the DC method, and in Study 3, these training methods were incorporated into a slide presentation and were used to rapidly (i.e., 15 min) train a large group of individuals (N = 87). Interpretation accuracy increased from a baseline mean of 55% to a treatment mean of 94% in Study 2 and from a baseline mean of 71% to a treatment mean of 95% in Study 3. Thus, Study 1 answered basic questions about the accuracy of several methods of interpreting A-B designs; Study 2 showed how that information could be used to increase the accuracy of human visual inspectors; and Study 3 showed how the training procedures from Study 2 could be modified into a format that would facilitate rapid training of large groups of individuals to interpret single-case designs.
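The dual-criteria rule summarized above can be sketched as a simple decision procedure: extend the baseline mean line and the baseline trend line into the treatment phase, count how many treatment-phase points fall beyond both lines in the expected direction, and compare that count against a binomial criterion with p = .5. The Python sketch below assumes that standard description of the DC method; the function name dc_cdc_decision, the ordinary least-squares trend line, the 0.25-standard-deviation shift used for the CDC variant, and the alpha = .05 cutoff are illustrative assumptions rather than specifics taken from the article (which derives its criterion lines from the split-middle approach).

    # Illustrative sketch of a DC/CDC-style decision rule; names and details are
    # assumptions based on the abstract's description, not the article's code.
    import numpy as np
    from scipy.stats import binom

    def dc_cdc_decision(baseline, treatment, expect_increase=True,
                        conservative=False, alpha=0.05):
        """Return True if the treatment phase meets the DC/CDC-style criterion."""
        baseline = np.asarray(baseline, dtype=float)
        treatment = np.asarray(treatment, dtype=float)

        # Criterion line 1: baseline mean, extended across the treatment phase.
        mean_line = np.full(len(treatment), baseline.mean())

        # Criterion line 2: baseline trend (ordinary least squares used here),
        # extrapolated across the treatment sessions.
        x_base = np.arange(len(baseline))
        slope, intercept = np.polyfit(x_base, baseline, 1)
        x_treat = np.arange(len(baseline), len(baseline) + len(treatment))
        trend_line = slope * x_treat + intercept

        # CDC variant (as sketched here): shift both lines by 0.25 baseline SDs
        # in the direction of the expected change, making the test more conservative.
        if conservative:
            shift = 0.25 * baseline.std(ddof=1)
            shift = shift if expect_increase else -shift
            mean_line = mean_line + shift
            trend_line = trend_line + shift

        # Count treatment points that fall beyond BOTH criterion lines.
        if expect_increase:
            hits = np.sum((treatment > mean_line) & (treatment > trend_line))
        else:
            hits = np.sum((treatment < mean_line) & (treatment < trend_line))

        # Binomial criterion: smallest count whose chance probability (p = .5)
        # is at or below alpha.
        n = len(treatment)
        criterion = int(binom.ppf(1 - alpha, n, 0.5)) + 1
        return hits >= criterion

A typical call would be dc_cdc_decision(baseline=[3, 4, 3, 5, 4], treatment=[6, 7, 6, 8, 7, 8], expect_increase=True), which returns True because all six treatment points exceed both extrapolated baseline lines, a count unlikely under the binomial chance model.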

Full Text

The Full Text of this article is available as a PDF (162.8 KB).
