Abstract
This chapter addresses students’ data interpretation, a key inquiry practice in the Next Generation Science Standards (NGSS), with which students have several distinct types of difficulty. In this work, we disentangle the difficulties associated with interpreting data from those associated with warranting claims. We do so within the context of Inq-ITS (Inquiry Intelligent Tutoring System), a lightweight learning management system that provides computer-based assessment and tutoring for science inquiry practices/skills. We conducted a systematic analysis of a subset of our data to determine whether our scaffolding supports students in acquiring and transferring these inquiry skills. We also describe an additional study that used Bayesian Knowledge Tracing (Corbett & Anderson, 1995), a computational approach that allows for analysis of the fine-grained sub-skills underlying the practices of data interpretation and warranting claims.
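As background for readers unfamiliar with Bayesian Knowledge Tracing, the standard update from Corbett and Anderson (1995) can be sketched as a short function. This is a minimal, generic illustration of the BKT equations, not the authors’ implementation; the parameter values (slip, guess, transit probabilities) are arbitrary placeholders chosen for the example.

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """One Bayesian Knowledge Tracing step (Corbett & Anderson, 1995).

    p_known:   prior probability the student has mastered the skill
    correct:   whether the observed response was correct
    p_slip:    probability of answering incorrectly despite mastery
    p_guess:   probability of answering correctly without mastery
    p_transit: probability of learning the skill at this opportunity
    Returns the posterior probability of mastery after the observation.
    """
    if correct:
        # Bayes rule: P(known | correct)
        cond = (p_known * (1 - p_slip)) / (
            p_known * (1 - p_slip) + (1 - p_known) * p_guess)
    else:
        # Bayes rule: P(known | incorrect)
        cond = (p_known * p_slip) / (
            p_known * p_slip + (1 - p_known) * (1 - p_guess))
    # Learning transition: the student may acquire the skill either way
    return cond + (1 - cond) * p_transit
```

Applied over a student’s sequence of scored attempts, repeated calls trace the estimated mastery of each sub-skill; a correct response raises the estimate and an incorrect one lowers it, with all estimates staying in [0, 1].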
References
Aleven, V., & Koedinger, K. (2000). Limitations of student control: Do students know when they need help? In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), Proceedings of the 5th international conference on intelligent tutoring systems (pp. 292–303). Berlin: Springer.
Aleven, V., McLaren, B., Roll, I., & Koedinger, K. (2004). Toward tutoring help seeking: Applying cognitive modeling to meta-cognitive skills. In J. C. Lester, R. M. Vicario, & F. Paraguaçu (Eds.), Proceedings of seventh international conference on intelligent tutoring systems (pp. 227–239). Berlin: Springer.
Anderson, J. R., Corbett, A. T., Koedinger, K. R., & Pelletier, R. (1995). Cognitive tutors: Lessons learned. The Journal of the Learning Sciences, 4(2), 167–207.
Baker, R., Corbett, A., Gowda, S., Wagner, A., MacLaren, B., Kauffman, L., Mitchell, A., & Giguere, S. (2010). Contextual slip and prediction of student performance after use of an intelligent tutor. In Proceedings of the 18th annual conference on user modeling, adaptation, and personalization (pp. 52–63). Berlin: Springer.
Baker, R., Gowda, S., & Corbett, A. (2011) Automatically detecting a student’s preparation for future learning: Help use is key. In Proceedings of the 4th international conference on educational data mining (pp. 179–188).
Beck, J., Chang, K. M., Mostow, J., & Corbett, A. (2008). Does help help? Introducing the bayesian evaluation and assessment methodology. In Intelligent tutoring systems (pp. 383–394). Berlin: Springer.
Berland, L. K., & McNeill, K. L. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94(5), 765–793.
Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26–55.
Buckley, B. C., Gobert, J. D., & Horwitz, P. (2006). Using log files to track students’ model-based inquiry. Paper presented at the 7th international conference of the learning sciences, Bloomington, IN.
Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction. Review of Educational Research, 63, 1–49.
Chinn, C. A., Duschl, R. A., Duncan, R. G., Buckland, L. A., & Pluta, W. J. (2008, June). A microgenetic classroom study of learning to reason scientifically through modeling and argumentation. In Proceedings of the 8th international conference for the learning sciences (Vol. 3, pp. 14–15). International Society of the Learning Sciences.
Corbett, A., & Anderson, J. R. (1995). Knowledge tracing: Modeling the acquisition of procedural knowledge. User Modeling and User-Adapted Interaction, 4(4), 253–278.
de Jong, T. (2006). Computer simulations: Technological advances in inquiry learning. Science, 312, 532–533.
Deters, K. M. (2005). Student opinions regarding inquiry-based labs. Journal of Chemical Education, 82(8), 1178–1180.
Dunbar, K. (1993). Concept discovery in a scientific domain. Cognitive Science: A Multidisciplinary Journal, 17(3), 397–434.
Edelson, D. C., O’Neill, D. K., Gomez, L. M., & D’Amico, L. (1995). A design for effective support of inquiry and collaboration. In The first international conference on computer support for collaborative learning (pp. 107–111). Mahwah: Erlbaum.
Fadel, C., Honey, M., & Pasnick, S. (2007). Assessment in the age of innovation. Education Week, 26(38), 34–40.
Glaser, R., Schauble, L., Raghavan, K., & Zeitz, C. (1992). Scientific reasoning across different domains. In E. DeCorte, M. Linn, H. Mandl, & L. Verschaffel (Eds.), Computer-based learning environments and problem-solving (pp. 345–371). Heidelberg: Springer.
Gobert, J. (2015). Microworlds. In R. Gunstone (Ed.), Encyclopedia of science education (pp. 638–639). Netherlands: Springer.
Gobert, J. D. (2016). Op-Ed: Educational data mining can be leveraged to improve assessment of science skills. US News & World Report. http://www.usnews.com/news/articles/2016-05-13/op-ed-educational-data-mining-can-enhance-science-education.
Gobert, J. D., & Sao Pedro, M. A. (2017). Inq-ITS: Design decisions used for an inquiry intelligent system that both assesses and scaffolds students as they learn. Invited chapter in A. A. Rupp, & J. Leighton (Co-Eds), Handbook of cognition and assessment. New York: Wiley/Blackwell.
Gobert, J. D., Sao Pedro, M. A., Baker, R. S., Toto, E., & Montalvo, O. (2012). Leveraging educational data mining for real-time performance assessment of scientific inquiry skills within microworlds. Journal of Educational Data Mining, 4(1), 111–143.
Gobert, J. D., Sao Pedro, M., Raziuddin, J., & Baker, R. S. (2013). From log files to assessment metrics: Measuring students’ science inquiry skills using educational data mining. The Journal of the Learning Sciences, 22(4), 521–563.
Gobert, J. D., Kim, Y. J., Sao Pedro, M. A., Kennedy, M., & Betts, C. G. (2015). Using educational data mining to assess students’ skills at designing and conducting experiments within a complex systems microworld. Thinking Skills and Creativity, 18, 81–90.
Gobert, J. D., Baker, R. S., & Sao Pedro, M. A. (2016a). U.S. patent no. 9,373,082. Washington, DC: U.S. Patent and Trademark Office.
Gobert, J., Sao Pedro, M., Betts, C., & Baker, R. S. (2016b). U.S. patent no. 9,564,057. Washington, DC: U.S. Patent and Trademark Office.
Gotwals, A. W., & Songer, N. B. (2009). Reasoning up and down a food chain: Using an assessment framework to investigate students’ middle knowledge. Science Education, 94(2), 259–281.
Hanley, J. A., & McNeil, B. J. (1982). The meaning and use of the area under a receiver operating characteristic (ROC) curve. Radiology, 143(1), 29–36.
Harrison, A. M., & Schunn, C. D. (2004). The transfer of logically general scientific reasoning skills. In K. Forbus, D. Gentner, & T. Regier (Eds.), Proceedings of the 26th annual conference of the cognitive science society (pp. 541–546). Mahwah: Erlbaum.
Hilton, M., & Honey, M. A. (Eds.). (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.
Kang, H., Thompson, J., & Windschitl, M. (2014). Creating opportunities for students to show what they know: The role of scaffolding in assessment tasks. Science Education, 98(4), 674–704.
Kanari, Z., & Millar, R. (2004). Reasoning from data: How students collect and interpret data in science investigations. Journal of Research in Science Teaching, 41(7), 748–769.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86.
Klahr, D., & Dunbar, K. (1988). Dual search space during scientific reasoning. Cognitive Science, 12, 1–48.
Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15(10), 661–667.
Klayman, J., & Ha, Y. W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211–228.
Koedinger, K., & Corbett, A. (2006). Cognitive tutors: Technology bringing learning sciences to the classroom. In R. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 61–77). New York: Cambridge University Press.
Koedinger, K., Pavlik, P. I., Jr., Stamper, J., Nixon, T., & Ritter, S. (2011). Avoiding problem selection thrashing with conjunctive knowledge tracing. In Proceedings of the 4th international conference on educational data mining.
Krajcik, J., Blumenfeld, P., Marx, R., Bass, K., Fredricks, J., & Soloway, E. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. The Journal of the Learning Sciences, 7, 313–350.
Krajcik, J., Marx, R., Blumenfeld, P., Soloway, E., & Fishman, B. (2000). Inquiry based science supported by technology: Achievement among urban middle school students. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans.
Kuhn, D. (1991). The skills of argument. Cambridge: Cambridge University Press.
Kuhn, D. (2005). Education for thinking. Cambridge, MA: Harvard University Press.
Kuhn, D., Schauble, L., & Garcia-Mila, M. (1992). Cross-domain development of scientific reasoning. Cognition and Instruction, 9(4), 285–327.
Kuhn, D., Garcia-Mila, M., Zohar, A., Andersen, C., White, S., Klahr, D., & Carver, S. (1995). Strategies of knowledge acquisition. Monographs of the Society for Research in Child Development, 60(4), 1–157.
Li, H., Gobert, J., & Dickler, R. (2017). Dusting off the messy middle: Assessing students’ inquiry skills through doing and writing. In E. André, R. Baker, X. Hu, M. Rodrigo, & B. du Boulay (Eds.), Lecture Notes in Computer Science (Vol. 10331, pp. 175–187). Cham: Springer.
McElhaney, K., & Linn, M. (2008). Impacts of students’ experimentation using a dynamic visualization on their understanding of motion. In Proceedings of the 8th international conference of the learning sciences (pp. 51–58). Netherlands: International Society of the Learning Sciences.
McElhaney, K., & Linn, M. (2010). Helping students make controlled experiments more informative. In K. Gomez, L. Lyons, & J. Radinsky (Eds.), Learning in the disciplines: Proceedings of the 9th international conference of the learning sciences (pp. 786–793). Chicago: International Society of the Learning Sciences.
McNeill, K. L., & Krajcik, J. (2007). Middle school students’ use of appropriate and inappropriate evidence in writing scientific explanations. In M. Lovett & P. Shah (Eds.), Thinking with data (pp. 233–265). New York: Taylor & Francis Group, LLC.
McNeill, K. L., & Krajcik, J. S. (2011). Supporting grade 5–8 students in constructing explanations in science: The claim, evidence, and reasoning framework for talk and writing. Upper Saddle River: Pearson.
McNeill, K. L., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students’ construction of scientific explanations by fading scaffolds in instructional materials. The Journal of the Learning Sciences, 15(2), 153–191.
Mislevy, R. J., Behrens, J. T., DiCerbo, K. E., & Levy, R. (2012). Design and discovery in educational assessment: Evidence centered design, psychometrics, and data mining. Journal of Educational Data Mining, 4(1), 11–48.
Moussavi, R., Kennedy, M., Sao Pedro, M. A., & Gobert, J. D. (2015). Evaluating a scaffolding design to support students’ data interpretation skills within a simulation-based inquiry environment. Presented at the meeting of the American Education Research Association, Chicago.
Moussavi, R., Sao Pedro, M., & Gobert, J. D. (2016a). Evaluating the efficacy of real-time scaffolding for data interpretation skills. Paper presented at the meeting of the American Education Research Association, Washington, DC.
Moussavi, R., Sao Pedro, M., & Gobert, J. D. (2016b). The effect of scaffolding on the immediate transfer of students’ data interpretation skills within science topics. Presented at the 12th International Conference of the Learning Sciences, Singapore.
National Research Council. (2011). Successful K-12 STEM education: Identifying effective approaches in science, technology, engineering, and mathematics. Washington, D.C.: National Academies Press.
Next Generation Science Standards Lead States. (2013). Next generation science standards: For states, by states. Washington, DC: The National Academies Press.
Njoo, M., & de Jong, T. (1993). Exploratory learning with computer simulations for control theory: Learning processes and instructional support. Journal of Research in Science Teaching, 30, 821–844.
Organization for Economic Cooperation and Development. (2018). PISA 2015 results in focus: What 15-year-olds know and what they can do with what they know. Paris: Organization for Economic Cooperation and Development.
Pellegrino, J., Chudowsky, N., & Glaser, R. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.
Quinn, J., & Alessi, S. (1994). The effects of simulation complexity and hypothesis-generation strategy on learning. Journal of Research on Computing in Education, 27(1), 75–91.
Reimann, P. (1991). Detecting functional relations in a computerized discovery environment. Learning and Instruction, 1(1), 45–65.
Roll, I., Yee, N., & Briseno, A. (2014). Students’ adaptation and transfer of strategies across levels of scaffolding in an exploratory environment. In Intelligent tutoring systems (pp. 348–353). Switzerland: Springer International Publishing.
Sao Pedro, M. (2013). Real-time assessment, prediction, and scaffolding of middle school students’ data collection skills within physical science simulations (Doctoral dissertation). Worcester: Worcester Polytechnic Institute.
Sao Pedro, M. A., Baker, R. S., Montalvo, O., Nakama, A., & Gobert, J. D. (2010). Using text replay tagging to produce detectors of systematic experimentation behavior patterns. In R. Baker, A. Merceron, & P. Pavlik (Eds.), Proceedings of the 3rd international conference on educational data mining (pp. 181–190).
Sao Pedro, M., Baker, R., & Gobert, J. (2012a). Improving construct validity yields better models of systematic inquiry, even with less information. In Proceedings of the 20th conference on user modeling, adaptation, and personalization (pp. 249–260). Berlin: Springer.
Sao Pedro, M., Gobert, J., & Baker, R. (2012b). Assessing the learning and transfer of data collection inquiry skills using educational data mining on students’ log files. Paper presented at The Annual Meeting of the American Educational Research Association, Vancouver.
Sao Pedro, M., Baker, R., & Gobert, J. (2013a). Incorporating scaffolding and tutor context into bayesian knowledge tracing to predict inquiry skill acquisition. In S. K. D’Mello, R. A. Calvo, & A. Olney (Eds.), Proceedings of the 6th international conference on educational data mining (pp. 185–192).
Sao Pedro, M., Baker, R., & Gobert, J. (2013b). What different kinds of stratification can reveal about the generalizability of data-mined skill assessment models. In Proceedings of the 3rd conference on learning analytics and knowledge.
Sao Pedro, M., Baker, R., Gobert, J., Montalvo, O., & Nakama, A. (2013c). Leveraging machine-learned detectors of systematic inquiry behavior to estimate and predict transfer of inquiry skill. User Modeling and User-Adapted Interaction, 23(1), 1–39.
Sao Pedro, M. A., Gobert, J. D., & Betts, C. G. (2014). Towards scalable assessment of performance-based skills: Generalizing a detector of systematic science inquiry to a simulation with a complex structure. In Intelligent tutoring systems (pp. 591–600). Switzerland: Springer International Publishing.
Sao Pedro, M., Gobert, J., Toto, E., & Paquette, L. (2015). Assessing transfer of students’ data analysis skills across physical science simulations. In I. Bejar (Chair), The state of the art in automated scoring of science inquiry tasks. Symposium conducted at the meeting of the American Education Research Association, Chicago.
Schauble, L. (1990). Belief revision in children: The role of prior knowledge and strategies for generating evidence. Journal of Experimental Child Psychology, 49, 31–57.
Schauble, L., Klopfer, L. E., & Raghavan, K. (1991). Students’ transition from an engineering model to a science model of experimentation. Journal of Research in Science Teaching, 28(9), 859–882.
Schauble, L., Glaser, R., Duschl, R. A., Schulze, S., & John, J. (1995). Students’ understanding of the objectives and procedures of experimentation in the science classroom. The Journal of the Learning Sciences, 4(2), 131–166.
Schneider, R., Krajcik, J., & Blumenfeld, P. (2005). Enacting reform-based science materials: The range of teacher enactments in reform classrooms. Journal of Research in Science Teaching, 42(3), 283–312.
Schunn, C. D., & Anderson, J. R. (1998). Scientific discovery. In J. R. Anderson (Ed.), The atomic components of thought (pp. 385–428). Mahwah: Lawrence Erlbaum Associates.
Schunn, C. D., & Anderson, J. R. (1999). The generality/specificity of expertise in scientific reasoning. Cognitive Science, 23(3), 337–370.
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
Shute, V., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive Learning Environments, 1, 55–71.
Staer, H., Goodrum, D., & Hackling, M. (1998). High school laboratory work in Western Australia: Openness to inquiry. Research in Science Education, 28(2), 219–228.
Toulmin, S. (1958). The uses of argument. New York: Cambridge University Press.
Tschirgi, J. E. (1980). Sensible reasoning: A hypothesis about hypotheses. Child Development, 51(1), 1–10.
van Joolingen, W. R., & de Jong, T. (1991a). Supporting hypothesis generation by learners exploring an interactive computer simulation. Instructional Science, 20(5), 389–404.
van Joolingen, W. R., & de Jong, T. (1991b). Characteristics of simulations for instructional settings. Education and Computing, 6(3-4), 241–262.
van Joolingen, W. R., & de Jong, T. (1993). Exploring a domain through a computer simulation: Traversing variable and relation space with the help of a hypothesis scratchpad. In D. Towne, T. de Jong, & H. Spada (Eds.), Simulation-based experiential learning (pp. 191–206). Berlin: Springer.
van Joolingen, W. R., & de Jong, T. (1997). An extended dual search space model of scientific discovery learning. Instructional Science, 25(5), 307–346.
Williamson, D., Mislevy, R., & Bejar, I. (2006). Automated scoring of complex tasks in computer-based testing. Mahwah: Lawrence Erlbaum Associates.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Gobert, J.D., Moussavi, R., Li, H., Sao Pedro, M., Dickler, R. (2018). Real-Time Scaffolding of Students’ Online Data Interpretation During Inquiry with Inq-ITS Using Educational Data Mining. In: Auer, M., Azad, A., Edwards, A., de Jong, T. (eds) Cyber-Physical Laboratories in Engineering and Science Education. Springer, Cham. https://doi.org/10.1007/978-3-319-76935-6_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-76934-9
Online ISBN: 978-3-319-76935-6