Toward Interdisciplinary Learning: Development and Validation of an Assessment for Interdisciplinary Understanding of Global Carbon Cycling

Abstract

This study developed and validated the Interdisciplinary Science Assessment for Carbon Cycling (ISACC), an assessment that measures interdisciplinary understanding using the topic of carbon cycling. The work was motivated by the need to assess interdisciplinary understanding, defined as the ability to solve problems requiring knowledge and skills from multiple disciplines. Levels of a construct map for interdisciplinary understanding were generated and refined through the construct-modeling design process (Wilson 2005). The key carbon-cycling concepts to be assessed were determined from experts’ concept maps and an analysis of the Next Generation Science Standards. The final version of the ISACC includes 11 multiple-choice (MC) items and eight constructed-response (CR) items covering nine key concepts; of these 19 items, six are single-disciplinary and 13 are interdisciplinary. Four hundred fifty-four students in grades 9–16 were recruited and administered the ISACC. For the CR items, scoring rubrics were developed and used by a group of evaluators to code student responses. Two item response theory models, a two-parameter logistic model and a generalized partial credit model, provided evidence of the construct validity of the items. All items reflected unidimensionality and local independence and showed moderate internal consistency (Cronbach’s alpha = 0.782), and all but one item fit the models well. The findings suggest that the ISACC is a promising tool for assessing interdisciplinary understanding of carbon cycling. Future directions include research into test fairness across gender and ethnic/racial groups and further development to cover typical high school students’ knowledge more thoroughly.
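
As a rough illustration of the psychometric quantities named above, the sketch below (Python; not the authors' code) computes Cronbach's alpha from an item-score matrix and evaluates the two-parameter logistic (2PL) and generalized partial credit (GPCM) item response functions. The item parameters and the score matrix are illustrative placeholders, not ISACC values.

```python
# Minimal sketch of the quantities named in the abstract: Cronbach's alpha for
# internal consistency, plus the 2PL and GPCM item response functions used to
# model dichotomous MC items and polytomous CR items. All numbers are toy values.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: (n_students, n_items) matrix of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def p_2pl(theta: float, a: float, b: float) -> float:
    """2PL probability of a correct response given ability theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def gpcm_probs(theta: float, a: float, b: np.ndarray) -> np.ndarray:
    """GPCM category probabilities for an item with step difficulties b;
    returns probabilities for scores 0..len(b)."""
    steps = np.concatenate(([0.0], a * (theta - b)))  # score 0 contributes nothing
    numer = np.exp(np.cumsum(steps))
    return numer / numer.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data shaped like the MC portion of the study: 454 students, 11 binary items.
    fake_scores = rng.integers(0, 2, size=(454, 11))
    print("alpha:", round(cronbach_alpha(fake_scores), 3))
    print("P(correct | theta=0):", round(p_2pl(0.0, a=1.2, b=-0.3), 3))
    print("GPCM category probs:", gpcm_probs(0.5, a=1.0, b=np.array([-0.5, 0.4, 1.1])).round(3))
```

The published analysis fit these models with dedicated IRT software (see Cai et al. 2016 below); the functions here only show the formulas being estimated.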

References

  • American Association for the Advancement of Science (AAAS). (1989). Science for all Americans. Washington, DC: American Association for the Advancement of Science.

  • American Association for the Advancement of Science (AAAS). (2009). Benchmarks for science literacy on-line. Retrieved from http://www.project2061.org/publications/bsl/online.

  • Australian Curriculum, Assessment and Reporting Authority (ACARA). (2012). The shape of the Australian Curriculum (version 4.0). Retrieved from http://www.acara.edu.au/verve/_resources/The_Shape_of_the_Australian_Curriculum_v4.pdf.

  • Beane, J. A. (1995). Curriculum integration and the disciplines of knowledge. Phi Delta Kappan, 616–622.

  • Birnbaum, A. (1968). Some latent trait models and their use in inferring an examinee’s ability. In F. M. Lord & M. R. Novick (Eds.), Statistical theories of mental test scores. MA: Addison-Wesley.

  • Boix Mansilla, V. (2005). Assessing student work at disciplinary crossroads. Change: The Magazine of Higher Learning, 37(1), 14–21.

  • Boix Mansilla, V., & Duraisingh, E. D. (2007). Targeted assessment of students’ interdisciplinary work: an empirically grounded framework proposed. The Journal of Higher Education, 78(2), 215–237.

  • Brown, N. J., & Wilson, M. (2011). A model of cognition: the missing cornerstone of assessment. Educational Psychology Review, 23(2), 221–234.

  • Cai, L., Thissen, D., & du Toit, S. (2016). IRTPRO 3 for Windows [Computer Software]. Skokie, IL: Scientific Software International, Inc.

  • California Department of Education. (1990). The California framework for science instruction. Sacramento: California Department of Education.

  • Chandramohan, B., & Fallows, S. J. (2009). Interdisciplinary learning and teaching in higher education: theory and practice. New York: Routledge.

  • Chen, W. H., & Thissen, D. (1997). Local dependence indexes for item pairs using item response theory. Journal of Educational and Behavioral Statistics, 22(3), 265–289.

  • Chi, M. T. H., & Ceci, S. J. (1987). Content knowledge: Its role, representation, and restructuring in memory development. In H. W. Reese (Ed.), Advances in child development and behavior (Vol. 20, pp. 91–142). New York: Academic Press.

  • Cortina, J. M. (1993). What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology, 78(1), 98–104.

  • Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: CBS College Publishing.

  • Dewey, J. (1938). Experience and education. New York: The Macmillan company.

  • Dorsey, D. W., Campbell, G. E., Foster, L. L., & Miles, D. E. (1999). Assessing knowledge structures: relations with experience and posttraining performance. Human Performance, 12(1), 31–57.

  • Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. New York: Psychology Press.

  • Fleiss, J. L. (1986). Reliability of measurement. The design and analysis of clinical experiments, 1–32.

  • Golding, C. (2009). Integrating the disciplines: successful interdisciplinary subjects. Melbourne: Centre for the Study of Higher Education, University of Melbourne.

  • Hambleton, R. K. (1989). Principles and selected applications of item response theory. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 147–200). New York: Macmillan.

  • Hartley, L. M., Momsen, J., Maskiewicz, A., & D’Avanzo, C. (2012). Energy and matter: differences in discourse in physical and biological sciences can be confusing for introductory biology students. BioScience, 62(5), 488–496.

  • Hirsh, S. (2011). Building professional development to support new student assessment systems. Retrieved from https://learningforward.org/docs/pdf/stephanie_hirsh-building_professional_development.pdf.

  • Hooper, D., Coughlan, J., & Mullen, M. (2008). Structural equation modelling: guidelines for determining model fit. Electronic Journal of Business Research Methods, 6(1), 53–60.

  • Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55.

  • Jho, H., Hong, O., & Song, J. (2016). An analysis of STEM/STEAM teacher education in Korea with a case study of two schools from a community of practice perspective. Eurasia Journal of Mathematics, Science & Technology Education, 12(7).

  • Johnson-Laird, P. N. (1980). Mental models in cognitive science. Cognitive Science, 4(1), 71–115.

  • Klein, J. T. (1990). Interdisciplinarity: history, theory, and practice. Detroit: Wayne State University Press.

  • Kline, R. (2015). Principles and practice of structural equation modeling (4th ed.). New York: Guilford Publications.

  • Krajcik, J. S., McNeill, K. L., & Reiser, B. J. (2008). Learning-goals-driven design model: developing curriculum materials that align with national standards and incorporate project-based pedagogy. Science Education, 92(1), 1–32. https://doi.org/10.1002/sce.20240.

  • Labaree, D. F. (2005). Progressivism, schools and schools of education: an American romance. Paedagogica Historica, 41(1–2), 275–288.

  • NGSS Lead States. (2013). Next Generation Science Standards: for states, by states. Washington, DC: National Academy Press.

  • Linn, M. C. (2006). The knowledge integration perspective on learning and instruction. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 243–264). New York: Cambridge University Press.

  • Liu, X. (2010). Using and developing measurement instruments in science education: a Rasch modeling approach. Charlotte: Information Age Pub.

  • Liu, O. L., Lee, H. S., Hofstetter, C., & Linn, M. C. (2008). Assessing knowledge integration in science: construct, measures, and evidence. Educational Assessment, 13(1), 33–55. https://doi.org/10.1080/10627190801968224.

  • Lord, F. M. (1980). Applications of item response theory to practical testing problems. Hillsdale: Lawrence Erlbaum Associates.

  • Lynn, M. R. (1986). Determination and quantification of content validity. Nursing Research, 35(6), 382–386.

  • Maskiewicz, A. C., Griscom, H. P., & Welch, N. T. (2012). Using targeted active-learning exercises and diagnostic question clusters to improve students’ understanding of carbon cycling in ecosystems. CBE-Life Sciences Education, 11(1), 58–67.

  • McComas, W. F., & Wang, H. A. (1998). Blended science: the rewards and challenges of integrating the science disciplines for instruction. School Science and Mathematics, 98(6), 340–348.

  • Metz, K. E. (1995). Reassessment of developmental constraints on children’s science instruction. Review of Educational Research, 65(2), 93–127.

  • Muraki, E. (1992). A generalized partial credit model: application of an EM algorithm. Applied Psychological Measurement, 16(2), 159–176.

  • Muthén, L. K., & Muthén, B. O. (1998–2015). Mplus user’s guide (7th ed.). Los Angeles, CA: Muthén & Muthén.

  • National Academy of Sciences. (2005). Facilitating interdisciplinary research. Washington, DC: National Academies.

  • National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.

  • National Research Council. (2012). A framework for K-12 science education: practices, crosscutting concepts, and core ideas. Washington, DC: National Academy Press.

  • National Science Teachers Association. (1964). Theory into action in science curriculum development. Washington, DC: National Science Teachers Association.

  • Neurath, O. (1996). Unified science as encyclopedic integration. Logical empiricism at its peak: Schlick, Carnap, and Neurath, 309–335.

  • Orlando, M., & Thissen, D. (2000). New item fit indices for dichotomous item response theory models. Applied Psychological Measurement, 24, 50–64.

  • Orlando, M., & Thissen, D. (2003). Further investigation of the performance of S-X2: an item fit index for use with dichotomous item response theory models. Applied Psychological Measurement, 27, 289–298.

  • Pellegrino, J. W., Krajcik, J. S., Stevens, S. Y., Swarat, S., Shin, N., & Delgado, C. (2008). Using construct-centered design to align curriculum, instruction, and assessment development in emerging science. In V. Kanselaar, V. Jonker, P. A. Kirschner, & F. Prins (Eds.), ICLS’ 08: international perspectives in the learning sciences: creating a learning world (Vol. 3, pp. 314–321). Utrecht: International Society of the Learning Sciences.

  • Piaget, J. (1978). The development of thought: equilibration of cognitive structures. New York: Viking Press.

  • Reckase, M. D. (1979). Unifactor latent trait models applied to multifactor tests: results and implications. Journal of Educational and Behavioral Statistics, 4(3), 207–230.

  • Reiska, P., Soika, K., & Cañas, A. J. (2018). Using concept mapping to measure changes in interdisciplinary learning during high school. Knowledge Management & E-Learning: An International Journal (KM&EL), 10(1), 1–24.

  • Reynolds, C. R., & Kaiser, S. M. (1990). Bias in assessment of aptitude. In C. R. Reynolds & R. W. Kamphaus (Eds.), Handbook of psychological and educational assessment of children: intelligence and achievement (pp. 611–653). New York: Guilford.

  • Rice, J., & Neureither, B. (2006). An integrated physical, earth, and life science course for pre-service K-8 teachers. Journal of Geoscience Education, 54(3), 255–261.

  • Rowntree, D. (1982). A dictionary of education. Totowa, NJ: Barnes & Noble Books.

  • Sabbag, A. G., & Zieffler, A. (2015). Assessing learning outcomes: an analysis of the goals-2 instrument. Statistics Education Research Journal, 14(2), 93–116.

  • Schaal, S., Bogner, F. X., & Girwidz, R. (2010). Concept mapping assessment of media assisted learning in interdisciplinary science education. Research in Science Education, 40(3), 339–352.

  • Scottish Government. (2008). Curriculum for Excellence: building the curriculum 3. Retrieved from http://www.scotland.gov.uk/Resource/Doc/226155/0061245.pdf.

  • Shen, J., Liu, O. L., & Sung, S. (2014). Designing interdisciplinary assessments in sciences for college students: an example on osmosis. International Journal of Science Education, 36(11), 1773–1793. https://doi.org/10.1080/09500693.2013.879224.

  • Shin, N., Stevens, S. Y., & Krajcik, J. S. (2010). Tracking student learning over time using construct-centred design. In S. Rodrigues (Ed.), Using analytical frameworks for classroom research: collecting data and analysing narrative (pp. 38–58). London: Routledge.

  • Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: uses in assessing rater reliability. Psychological Bulletin, 86(2), 420–428.

  • Sokolowski, J. A., & Banks, C. M. (2010). Modeling and simulation fundamentals: theoretical underpinnings and practical domains. Hoboken: John Wiley & Sons, Inc.

  • Spelt, E. J., Biemans, H. J., Tobi, H., Luning, P. A., & Mulder, M. (2009). Teaching and learning in interdisciplinary higher education: a systematic review. Educational Psychology Review, 21(4), 365–378.

  • Stichweh, R. (2003). Differentiation of scientific disciplines: causes and consequences. Unity of Knowledge in Transdisciplinary Research for Sustainability, 1, 1–8.

  • Taber, K. S. (2018). The use of Cronbach’s alpha when developing and reporting research instruments in science education. Research in Science Education, 1–24.

  • Tanner, K., & Allen, D. (2005). Approaches to biology teaching and learning: understanding the wrong answers—teaching toward conceptual change. Cell Biology Education, 4(2), 112–117.

  • Teresi, J. A., Ocepek-Welikson, K., Ramirez, M., Kleinman, M., Ornstein, K., & Siu, A. (2015). Evaluation of measurement equivalence of the family satisfaction with the end-of-life care in an ethnically diverse cohort: tests of differential item functioning. Palliative Medicine, 29(1), 83–96. https://doi.org/10.1177/0269216314545802.

  • Van Merriënboer, J. J. G. (1997). Training complex cognitive skills: a four-component instructional design model for technical training. Englewood Cliffs: Educational Technology.

  • Versprille, A. N. (2014). General chemistry students’ understanding of the chemistry underlying climate science (Unpublished doctoral dissertation). Purdue University, West Lafayette, IN.

  • Weingart, P. (2010). A short history of knowledge formations. In R. Frodemann, K. J. Thomson, & C. Mitcham (Eds.), The Oxford handbook of interdisciplinarity (pp. 3–14). Oxford: Oxford University Press.

  • Wiggins, G. P., & McTighe, J. (1998). Understanding by design. Alexandria: Association for Supervision and Curriculum.

  • Wilson, M. (2005). Constructing measures: an item response modeling approach. Mahwah: Lawrence Erlbaum Associates.

  • You, H. S. (2016). Toward interdisciplinary science learning: Development of an assessment for interdisciplinary understanding of ‘carbon cycling’ (Unpublished doctoral dissertation). The University of Texas at Austin, Austin, Texas.

  • You, H. S., Marshall, J. A., & Delgado, C. (2018). Assessing students' disciplinary and interdisciplinary understanding of global carbon cycling. Journal of Research in Science Teaching, 55(3), 377–398.

  • Yu, C. Y. (2002). Evaluating cutoff criteria of model fit indices for latent variable models with binary and continuous outcomes (Unpublished doctoral dissertation). University of California, Los Angeles, CA.

Author information

Corresponding author

Correspondence to Hye Sun You.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

You, H.S., Marshall, J.A. & Delgado, C. Toward Interdisciplinary Learning: Development and Validation of an Assessment for Interdisciplinary Understanding of Global Carbon Cycling. Res Sci Educ 51, 1197–1221 (2021). https://doi.org/10.1007/s11165-019-9836-x
