
The Impact of Using Coherent Curriculum on Students’ Understanding of Core Ideas in Chemistry

  • Namsoo Shin (corresponding author)
  • Sung-Youn Choi
  • Shawn Y. Stevens
  • Joseph S. Krajcik

Abstract

A coherent curriculum that clearly defines sets of ideas that build upon one another to meet desired learning goals can successfully support the development of meaningful understanding across time. This study explores the influence of a coherent curriculum, as a systematic approach to aligning and sequencing specific ideas, on students' development of deeper knowledge of chemistry core ideas by comparing students in classrooms that used coherent instructional materials with those that used more traditional materials. We followed approximately 1225 middle school students from 6 schools in 4 different states over 4 time points, spanning 2 school years and 3 grade levels. A total of 165 assessment items were used to measure student understanding along a learning progression, which provides a guide for how students may move toward more sophisticated understanding over time. Students' ability parameters, estimated through item response theory analysis, were used to compare achievement between the coherent and traditional curriculum groups. Hierarchical linear model analysis was employed to compare students' learning growth rates across the two sets of curriculum materials by grade and school. The results revealed differences in learning trends between the coherent and traditional groups by grade and school performance level. The coherent curriculum showed a promising effect on student learning, particularly in the high- and middle-performing schools; however, its benefits were limited in the low-performing school. The study concludes with a discussion of the challenges and promise of using coherent curriculum materials to support student learning.
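
The analysis described in the abstract proceeds in two steps: item response theory (IRT) places students on a common ability scale, and a hierarchical (mixed-effects) growth model then compares learning growth rates between the curriculum groups. As a minimal sketch of the second step, the Python code below fits such a growth model with statsmodels; it is an illustration, not the authors' published analysis code, and it assumes a hypothetical long-format table student_abilities.csv with columns student_id, time (0–3), curriculum (coherent vs. traditional), and ability (an IRT-based estimate from a prior calibration).

    # Illustrative sketch only; not the authors' analysis pipeline.
    # Assumes hypothetical columns: student_id, time (0-3),
    # curriculum ("coherent" or "traditional"), and ability
    # (an item-response-theory ability estimate from a prior calibration).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("student_abilities.csv")  # hypothetical file name

    # Growth model with repeated measures nested within students.
    # The time:curriculum interaction estimates how much the learning
    # growth rate differs between the two curriculum groups.
    model = smf.mixedlm(
        "ability ~ time * curriculum",
        data=df,
        groups=df["student_id"],
        re_formula="~time",  # random intercept and slope per student
    )
    result = model.fit()
    print(result.summary())

A fuller model matching the design described in the abstract would add a level for schools (students nested within schools) and terms for grade level; the two-level sketch above is kept deliberately simple.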

Keywords

Coherent curriculum · Learning progression · Middle school · Science

Notes

Funding

This research was supported by a National Science Foundation grant (DRL 0822038).

Supplementary material

ESM 1 (DOCX, 139 kb)


Copyright information

© Ministry of Science and Technology, Taiwan 2017

Authors and Affiliations

  • Namsoo Shin (1) (corresponding author)
  • Sung-Youn Choi (2)
  • Shawn Y. Stevens (3)
  • Joseph S. Krajcik (4)

  1. College of Education, Michigan State University, East Lansing, USA
  2. Dongguk University, Seoul, South Korea
  3. University of Michigan, Ann Arbor, USA
  4. CREATE for STEM Institute, Michigan State University, East Lansing, USA
