A Problematic Leap in the Use of Test Data: From Performance to Inference

Despite all the rhetoric about the new millennium, few assessment issues thus far belong exclusively to the 21st century. One issue spilling over from the 20th century is the demand for schools and teachers to use assessment information to improve student achievement and, more generally, to enhance educational systems. Among the many possible mechanisms by which schools and teachers can put assessment information to efficient use are feedback into the student learning process and the enhancement of teachers’ pedagogical repertoires. When the assessment instrument is a standardised test, the product (student responses) gives information not only about what was learnt and how well it was learnt, but also about what was not learnt, with hints as to why this might be so.



Author information

Correspondence to Gabrielle Matters.


Copyright information

© 2009 Springer Science+Business Media B.V.


Cite this chapter

Matters, G. (2009). A Problematic Leap in the Use of Test Data: From Performance to Inference. In: Wyatt-Smith, C., Cumming, J.J. (eds) Educational Assessment in the 21st Century. Springer, Dordrecht. https://doi.org/10.1007/978-1-4020-9964-9_11

