An Analysis of the Meaning and Use of Student Learning Objectives

Chapter in Student Growth Measures in Policy and Practice

Abstract

In new state teacher evaluation systems, Student Learning Objectives (SLOs) have emerged to satisfy requirements that teachers of non-tested subjects be evaluated with student growth measures (SGMs). Many of these systems conceive of SLOs as serving two distinct purposes: supporting instructional improvement and evaluating personnel. Drawing on data from a broader analysis of Race to the Top (RttT) state implementations, this chapter reviews initial design efforts, describes SLO implementations, and considers the roles SLOs should play within teacher evaluation systems. We argue that SLOs must be re-conceptualized as a measure of teacher practice, that the tremendous variation in measure design needs close scrutiny, and that state administrators must be conscious of the purpose for which they are using SLOs so that they can design appropriately valid teacher evaluation systems.



Acknowledgments

We would like to thank Christy Lyon and Katie Buckley for their careful reading and helpful suggestions, and Colleen McDermott for her outstanding editorial assistance.

Portions of this research were supported by a grant from the American Federation of Teachers (AFT), but the content is the sole work of the authors.


Copyright information

© 2016 The Author(s)

About this chapter

Cite this chapter

Crouse, K., Gitomer, D.H., Joyce, J. (2016). An Analysis of the Meaning and Use of Student Learning Objectives. In: Kappler Hewitt, K., Amrein-Beardsley, A. (eds) Student Growth Measures in Policy and Practice. Palgrave Macmillan, New York. https://doi.org/10.1057/978-1-137-53901-4_11

  • DOI: https://doi.org/10.1057/978-1-137-53901-4_11

  • Publisher Name: Palgrave Macmillan, New York

  • Print ISBN: 978-1-137-53900-7

  • Online ISBN: 978-1-137-53901-4

  • eBook Packages: Education, Education (R0)
