Tiered Models of Integrated Academic and Behavioral Support: Effect of Implementation Level on Academic Outcomes

  • Amity Noltemeyer
  • Frank J. Sansosti
General Article


This exploratory study examined (a) Integrated Systems Model (ISM) implementation levels, and (b) the effect of implementation of the academic and behavioral components of ISM on student academic outcomes. Participants included 2,660 students attending six suburban elementary schools. Hierarchical linear regression was conducted using a control block of three school demographic variables (initial student oral reading fluency from one year prior, percentage of economically disadvantaged students, and percentage of non-minority students), and a block of three implementation variables (academic, behavioral, and overall implementation of ISM). A mean of 55% overall implementation was found, with higher implementation of the behavioral than the academic components of ISM. Three significant regression models were found, and a positive effect of academic implementation emerged. Limitations and implications are discussed.


Keywords: RtI · PBS · Integrated Systems Model · Implementation Integrity · Reading Outcomes




Copyright information

© California Association of School Psychologists 2012

Authors and Affiliations

  1. Department of Educational Psychology, Miami University, Oxford, USA
  2. Kent State University, Kent, USA
