Research in Science Education, Volume 45, Issue 1, pp 41–58

The Challenge of Evaluating Students’ Scientific Literacy in a Writing-to-Learn Context



Abstract

This paper reports on the challenge of evaluating students’ scientific literacy in a writing-to-learn context, as illustrated by our experience with an online science-writing project. In this mixed methods study, year 9 students in a case study class (13–14 year olds, n = 26) authored a series of two ‘hybridised’ short stories that merged scientific and narrative genres about the socioscientific issue of biosecurity. To measure the efficacy of the intervention, we sought evidence of students’ conceptual understanding communicated through their stories. Finding a suitable instrument presented our first challenge and led to the development of scoring matrices to evaluate students’ derived sense of scientific literacy. Student interviews were also conducted to explore their understanding of concepts related to the biosecurity context. While the results of these analyses showed significant improvements in students’ understanding arising from their participation in the writing tasks, the interviews highlighted a second challenge in evaluating students’ scientific literacy: a disparity between their written and vocalised understandings. The majority of students expressed a deeper level of conceptual understanding during the interviews than they did in their stories. The interviews also revealed alternative conceptions and instances of superficial understanding that were not expressed in their writing. Aside from the methodological challenge of analysing stories quantitatively, these findings suggest that evaluating students’ scientific literacy in a writing-to-learn context can be difficult. An examination of these artefacts in combination with interviews about students’ written work provided a more comprehensive evaluation of their developing scientific literacy. The implications of this study for our understanding of the derived sense of scientific literacy, as well as implications for classroom practice, are discussed.


Keywords: Scientific literacy · Assessment · Writing-to-learn · Narrative · Learning science · Socioscientific issues



Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. James Cook University, Townsville, Australia
  2. Murdoch University, Murdoch, Australia
