Validation of an Instrument for Measuring Students’ Understanding of Interdisciplinary Science in Grades 4-8 over Multiple Semesters: A Rasch Measurement Study

Article

Abstract

To date, relatively little effort has been invested in developing reliable, valid, and engaging assessments of school science, particularly assessments of interdisciplinary science aligned with the new Next Generation Science Standards (NGSS). Existing tools rely mostly on multiple-choice items, and student outcomes are typically evaluated only through raw scores on standardized tests. Moreover, because it depends on such raw scores, educational research on student science achievement is usually confined to a single grade level or subject. This study presents an approach to designing, validating, and iteratively improving an instrument for assessing student understanding of interdisciplinary science across grade levels and over time, and uses the instrument to investigate student learning growth. Data were collected through surveys of elementary and middle school students participating in an Interdisciplinary Science and Engineering Partnership (ISEP) in the northeastern USA. Multiple sources of empirical evidence, including dimensionality, model-data fit, and validity, indicate that the instrument is of generally good quality. Student learning growth in understanding interdisciplinary science shows a sharp increase between elementary and middle school. The study informs the development of more reliable and valid instruments for assessing students’ science understanding under the new standards and offers suggestions for implementation in both educational research and practice.
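For readers unfamiliar with the approach, the dichotomous Rasch model underlying this kind of analysis (stated here in its standard textbook form, not reproduced from the article itself) gives the probability that person n with ability θ_n answers item i with difficulty δ_i correctly as

$$P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)},$$

which places persons and items on a common logit scale and thereby allows comparisons across grade levels and test administrations that raw scores alone cannot support.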

Keywords

Crosscutting concepts · Interdisciplinary science · Instrument validation · Learning growth · Rasch measurement

Acknowledgements

This paper is based upon work supported by the National Science Foundation under Grant No. DUE-1102998. Any opinions, findings, and conclusions or recommendations expressed in the materials are those of the authors and do not necessarily reflect the views of the National Science Foundation.


Copyright information

© Ministry of Science and Technology, Taiwan 2017

Authors and Affiliations

  1. Learning and Instruction, University at Buffalo, SUNY, Buffalo, USA
  2. Institution of Chemical Education, Northeast Normal University, Changchun, People’s Republic of China
