
Perspectives on Behavior Science, Volume 42, Issue 1, pp. 153–162

Using Single-Case Designs in Practical Settings: Is Within-Subject Replication Always Necessary?

  • Marc J. Lanovaz
  • Stéphanie Turgeon
  • Patrick Cardinal
  • Tara L. Wheatley

Abstract

Behavior analysts have widely adopted and embraced within-subject replication through the use of reversal and multielement designs. However, the withdrawal of treatment, which is central to these designs, may not be desirable, feasible, or even ethical in practical settings. To examine this issue, we extracted 501 ABAB graphs from theses and dissertations and assessed the extent to which we would have reached correct or incorrect conclusions had we based our analysis on the initial AB component only. In our first experiment, we examined the proportion of datasets for which the results of the first AB component matched the results of the subsequent phase reversals. In our second experiment, we calculated three effect size estimates for the same datasets to examine whether these measures could predict the relevance of conducting a within-subject replication. Our analyses indicated that the initial effects were successfully replicated at least once in approximately 85% of the cases and that effect size may predict the probability of within-subject replication. Overall, our results support the rather controversial proposition that it may be possible to set threshold values of effect size above which conducting a replication could be considered unnecessary. That said, more research is needed to confirm and examine the generalizability of these results prior to recommending changes in practice.
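The abstract does not name the three effect size estimates used in the second experiment. As a purely illustrative sketch, the snippet below computes one common nonoverlap measure for single-case AB data, the percentage of treatment-phase points exceeding the baseline median (PEM), and applies a hypothetical decision rule of the kind the authors propose. The function name, the data, the threshold value, and the direction of improvement are assumptions for illustration only, not the authors' method.

```python
# Minimal sketch (not the authors' code): percentage of treatment-phase points
# exceeding the baseline median (PEM), a common nonoverlap effect size for
# single-case designs. All values below are hypothetical.

import statistics

def pem(baseline, treatment, improvement="decrease"):
    """Return the proportion of treatment points beyond the baseline median."""
    median = statistics.median(baseline)
    if improvement == "decrease":
        exceeding = sum(1 for x in treatment if x < median)
    else:
        exceeding = sum(1 for x in treatment if x > median)
    return exceeding / len(treatment)

# Hypothetical initial AB component: frequency of a target behavior per session.
baseline_a1 = [8, 9, 7, 10, 8]
treatment_b1 = [4, 3, 5, 2, 3, 2]

effect = pem(baseline_a1, treatment_b1, improvement="decrease")
print(f"PEM for the initial AB component: {effect:.2f}")

# Hypothetical decision rule: only a value below some threshold would prompt a
# within-subject replication (i.e., a return to baseline and reintroduction).
THRESHOLD = 0.90  # assumed cutoff, not an empirically derived value
if effect < THRESHOLD:
    print("Effect below threshold: a within-subject replication may be warranted.")
else:
    print("Effect above threshold: replication may add little information.")
```

In this sketch, a PEM of 1.00 would mean every treatment point fell beyond the baseline median; the paper's contribution is the empirical question of whether such large effects in the first AB component reliably replicate in subsequent reversals.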

Keywords

AB design · Effect size · Error rate · Replication · Single-case design

Notes

Funding Information

This research project was supported in part by a salary award (no. 30827) and a grant (no. 32612) from the Fonds de Recherche du Québec—Santé as well as a grant from the Canadian Institutes of Health Research (no. 136895) to the first author.


Copyright information

© Association for Behavior Analysis International 2018

Authors and Affiliations

  • Marc J. Lanovaz (1)
  • Stéphanie Turgeon (2)
  • Patrick Cardinal (3)
  • Tara L. Wheatley (4)

  1. Université de Montréal and Centre de Recherche du CHU Sainte-Justine, Montréal, Canada
  2. Université de Montréal, Montréal, Canada
  3. École de Technologie Supérieure, Montreal, Canada
  4. Halton Catholic District School Board, Burlington, Canada
