Behavior Analysis in Practice, Volume 12, Issue 2, pp 491–502

Systematic Protocols for the Visual Analysis of Single-Case Research Data

  • Katie Wolfe
  • Erin E. Barton
  • Hedda Meadan
Technical and Tutorials


Researchers in applied behavior analysis and related fields, such as special education and school psychology, use single-case designs to evaluate causal relations between variables and the effectiveness of interventions. Visual analysis is the primary method by which single-case research data are analyzed; however, research suggests that visual analysis may be unreliable. In the absence of specific guidelines to operationalize the process of visual analysis, it is likely to be influenced by idiosyncratic factors and individual variability. To address this gap, we developed systematic, responsive protocols for the visual analysis of A-B-A-B and multiple-baseline designs. The protocols guide the analyst through the process of visual analysis and synthesize responses into a numeric score. In this paper, we describe the content of the protocols, illustrate their application to two graphs, and describe a small-scale evaluation study. We also describe considerations and future directions for the development and evaluation of the protocols.
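The final step the abstract describes, synthesizing an analyst's responses into a numeric score, can be illustrated with a minimal sketch. The feature names (drawn from commonly cited visual-analysis dimensions such as level, trend, and overlap) and the 0–100 scaling below are hypothetical stand-ins for illustration only; they are not the actual items or weighting used in the published protocols.

```python
# Hypothetical sketch of synthesizing visual-analysis ratings into a score.
# Feature names and the scoring scheme are illustrative assumptions, not the
# authors' published protocol items.

FEATURES = ["level", "trend", "variability", "immediacy", "overlap", "consistency"]

def comparison_score(ratings):
    """Score one baseline-to-intervention comparison.

    `ratings` maps each feature to True if the analyst judged that
    feature to support a basic effect, else False.
    """
    unknown = set(ratings) - set(FEATURES)
    if unknown:
        raise ValueError(f"unexpected features: {sorted(unknown)}")
    return sum(bool(ratings[f]) for f in FEATURES)

def design_score(comparisons):
    """Average per-comparison scores across all phase comparisons in a
    design (e.g., the three adjacent-phase comparisons in an A-B-A-B
    design) and rescale the result to a 0-100 range.
    """
    per_comparison = [comparison_score(r) for r in comparisons]
    maximum = len(FEATURES) * len(per_comparison)
    return round(100 * sum(per_comparison) / maximum, 1)
```

For example, a design in which every feature supports an effect in every comparison would score 100, while mixed judgments yield a proportionally lower score, giving reviewers a common numeric summary to compare or aggregate.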


Keywords: Visual analysis · Single-case research · Visual inspection · Data analysis


Compliance with Ethical Standards

Conflict of Interest

Katie Wolfe declares that she has no conflict of interest. Erin E. Barton declares that she has no conflict of interest. Hedda Meadan declares that she has no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The University of South Carolina Institutional Review Board approved the procedures in this study.

Informed Consent

Informed consent was obtained from all individual participants included in the study.



Copyright information

© Association for Behavior Analysis International 2019

Authors and Affiliations

  1. Department of Educational Studies, University of South Carolina, Columbia, USA
  2. Department of Special Education, Vanderbilt University, Nashville, USA
  3. Department of Special Education, University of Illinois at Urbana–Champaign, Champaign, USA
