
Research in Science Education, Volume 47, Issue 1, pp 49–66

Bringing CASE in from the Cold: the Teaching and Learning of Thinking

  • Mary Oliver
  • Grady Venville

Abstract

Thinking Science is a 2-year program of professional development for teachers and thinking lessons for students in junior high school science classes. This paper presents research on the effects of Thinking Science on students’ levels of cognition in Australia. The research is timely given the newly implemented Australian F-10 curriculum, which includes critical thinking as a general capability. The research design was a quasi-experiment with pre- and post-intervention cognitive tests conducted with participating students (n = 655) from nine cohorts in seven high schools. Over the length of the program, participating students showed significant cognitive gains compared with an age-matched control group. Also noteworthy is a correlation between baseline cognitive score and school Index of Community Socio-Educational Advantage (ICSEA). We argue that the teaching of thinking be brought into the mainstream of educational discourse and that the principles underpinning evidence-based programs such as Thinking Science be universally adopted.

Keywords

Thinking skills · Metacognition · Cognitive conflict · Pedagogy

Acknowledgments

This research was supported by a grant from the Australian Research Council (DP1093877). The ideas presented in the paper are those of the authors and not the funding institution. We acknowledge the late Professor Philip Adey, whose wisdom and encouragement over the years of this research were invaluable.


Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  1. School of Education, The University of Nottingham, Nottingham, UK
  2. Graduate School of Education, University of Western Australia, Crawley, Australia
