Real-Time Scaffolding of Students’ Online Data Interpretation During Inquiry with Inq-ITS Using Educational Data Mining

Chapter in Cyber-Physical Laboratories in Engineering and Science Education

Abstract

This chapter addresses students’ data interpretation, a key NGSS inquiry practice with which students have several distinct types of difficulties. In this work, we disentangle the difficulties associated with interpreting data from those associated with warranting claims. We do so within the context of Inq-ITS (Inquiry Intelligent Tutoring System), a lightweight learning management system that provides computer-based assessment and tutoring for science inquiry practices/skills. We conducted a systematic analysis of a subset of our data to examine whether our scaffolding supports students in acquiring and transferring these inquiry skills. We also describe an additional study that used Bayesian Knowledge Tracing (Corbett & Anderson, 1995), a computational approach that allows for analysis of the fine-grained sub-skills underlying the practices of data interpretation and warranting claims.
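
For readers unfamiliar with Bayesian Knowledge Tracing, the sketch below illustrates the standard two-step BKT update (the posterior probability that a skill is known given one scored response, followed by a learning transition) that underlies this kind of sub-skill modeling. It is a minimal illustration only: the function name and the slip, guess, and learn values are assumed placeholders, not the parameterization estimated in the chapter.

```
# Minimal sketch of the standard Bayesian Knowledge Tracing update
# (Corbett & Anderson, 1995). Parameter values are illustrative
# placeholders, not those fit to Inq-ITS data in this chapter.

def bkt_update(p_known, correct, p_slip=0.10, p_guess=0.20, p_learn=0.15):
    """Return the updated P(skill known) after one scored response."""
    if correct:
        # Correct response: either knew the skill and did not slip, or guessed.
        posterior = (p_known * (1 - p_slip)) / (
            p_known * (1 - p_slip) + (1 - p_known) * p_guess)
    else:
        # Incorrect response: either knew it but slipped, or did not know and did not guess.
        posterior = (p_known * p_slip) / (
            p_known * p_slip + (1 - p_known) * (1 - p_guess))
    # Learning step: the student may transition to the known state.
    return posterior + (1 - posterior) * p_learn

# Example: trace estimated mastery of a data-interpretation sub-skill over three attempts.
p = 0.30  # prior probability that the sub-skill is already known
for outcome in (True, False, True):
    p = bkt_update(p, outcome)
    print(round(p, 3))
```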

References

  • Aleven, V., & Koedinger, K. (2000). Limitations of student control: Do students know when they need help? In G. Gauthier, C. Frasson, & K. VanLehn (Eds.), Proceedings of the 5th international conference on intelligent tutoring systems (pp. 292–303). Berlin: Springer.

  • Aleven, V., McLaren, B., Roll, I., & Koedinger, K. (2004). Toward tutoring help seeking: Applying cognitive modeling to meta-cognitive skills. In J. C. Lester, R. M. Vicari, & F. Paraguaçu (Eds.), Proceedings of the seventh international conference on intelligent tutoring systems (pp. 227–239). Berlin: Springer.

  • Anderson, J. R., Corbett, A. T., Koedinger, K. R., & Pelletier, R. (1995). Cognitive tutors: Lessons learned. The Journal of the Learning Sciences, 4(2), 167–207.

  • Baker, R., Corbett, A., Gowda, S., Wagner, A., MacLaren, B., Kauffman, L., Mitchell, A., & Giguere, S. (2010). Contextual slip and prediction of student performance after use of an intelligent tutor. In Proceedings of the 18th annual conference on user modeling, adaptation, and personalization (pp. 52–63). Berlin: Springer.

  • Baker, R., Gowda, S., & Corbett, A. (2011). Automatically detecting a student’s preparation for future learning: Help use is key. In Proceedings of the 4th international conference on educational data mining (pp. 179–188).

  • Beck, J., Chang, K. M., Mostow, J., & Corbett, A. (2008). Does help help? Introducing the Bayesian evaluation and assessment methodology. In Intelligent tutoring systems (pp. 383–394). Berlin: Springer.

  • Berland, L. K., & McNeill, K. L. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94(5), 765–793.

  • Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26–55.

  • Buckley, B. C., Gobert, J. D., & Horwitz, P. (2006). Using log files to track students’ model-based inquiry. Paper presented at the 7th international conference of the learning sciences, Bloomington, IN.

  • Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction. Review of Educational Research, 63, 1–49.

  • Chinn, C. A., Duschl, R. A., Duncan, R. G., Buckland, L. A., & Pluta, W. J. (2008, June). A microgenetic classroom study of learning to reason scientifically through modeling and argumentation. In Proceedings of the 8th international conference for the learning sciences (Vol. 3, pp. 14–15). International Society of the Learning Sciences.

  • Corbett, A., & Anderson, J. R. (1995). Knowledge tracing: Modeling the acquisition of procedural knowledge. User Modeling and User-Adapted Interaction, 4(4), 253–278.

  • de Jong, T. (2006). Computer simulations: Technological advances in inquiry learning. Science, 312, 532–533.

  • Deters, K. M. (2005). Student opinions regarding inquiry-based labs. Journal of Chemical Education, 82(8), 1178–1180.

  • Dunbar, K. (1993). Concept discovery in a scientific domain. Cognitive Science: A Multidisciplinary Journal, 17(3), 397–434.

  • Edelson, D. C., O’Neill, D. K., Gomez, L. M., & D’Amico, L. (1995). A design for effective support of inquiry and collaboration. In The first international conference on computer support for collaborative learning (pp. 107–111). Mahwah: Erlbaum.

  • Fadel, C., Honey, M., & Pasnick, S. (2007). Assessment in the age of innovation. Education Week, 26(38), 34–40.

  • Glaser, R., Schauble, L., Raghavan, K., & Zeitz, C. (1992). Scientific reasoning across different domains. In E. DeCorte, M. Linn, H. Mandl, & L. Verschaffel (Eds.), Computer-based learning environments and problem-solving (pp. 345–371). Heidelberg: Springer.

  • Gobert, J. (2015). Microworlds. In R. Gunstone (Ed.), Encyclopedia of science education (pp. 638–639). Netherlands: Springer.

  • Gobert, J. D. (2016). Op-Ed: Educational data mining can be leveraged to improve assessment of science skills. US News & World Report. http://www.usnews.com/news/articles/2016-05-13/op-ed-educational-data-mining-can-enhance-science-education.

  • Gobert, J. D., & Sao Pedro, M. A. (2017). Inq-ITS: Design decisions used for an inquiry intelligent system that both assesses and scaffolds students as they learn. In A. A. Rupp & J. Leighton (Eds.), Handbook of cognition and assessment. New York: Wiley/Blackwell.

  • Gobert, J. D., Sao Pedro, M. A., Baker, R. S., Toto, E., & Montalvo, O. (2012). Leveraging educational data mining for real-time performance assessment of scientific inquiry skills within microworlds. Journal of Educational Data Mining, 4(1), 111–143.

  • Gobert, J. D., Sao Pedro, M., Raziuddin, J., & Baker, R. S. (2013). From log files to assessment metrics: Measuring students’ science inquiry skills using educational data mining. The Journal of the Learning Sciences, 22(4), 521–563.

  • Gobert, J. D., Kim, Y. J., Sao Pedro, M. A., Kennedy, M., & Betts, C. G. (2015). Using educational data mining to assess students’ skills at designing and conducting experiments within a complex systems microworld. Thinking Skills and Creativity, 18, 81–90.

  • Gobert, J. D., Baker, R. S., & Sao Pedro, M. A. (2016a). U.S. patent no. 9,373,082. Washington, DC: U.S. Patent and Trademark Office.

  • Gobert, J., Sao Pedro, M., Betts, C., & Baker, R. S. (2016b). U.S. patent no. 9,564,057. Washington, DC: U.S. Patent and Trademark Office.

  • Gotwals, A. W., & Songer, N. B. (2009). Reasoning up and down a food chain: Using an assessment framework to investigate students’ middle knowledge. Science Education, 94(2), 259–281.

  • Hanley, J. A., & McNeil, B. J. (1982). The meaning and use of the area under a receiver operating characteristic (ROC) curve. Radiology, 143(1), 29–36.

  • Harrison, A. M., & Schunn, C. D. (2004). The transfer of logically general scientific reasoning skills. In K. Forbus, D. Gentner, & T. Regier (Eds.), Proceedings of the 26th annual conference of the cognitive science society (pp. 541–546). Mahwah: Erlbaum.

  • Hilton, M., & Honey, M. A. (Eds.). (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.

  • Kang, H., Thompson, J., & Windschitl, M. (2014). Creating opportunities for students to show what they know: The role of scaffolding in assessment tasks. Science Education, 98(4), 674–704.

  • Kanari, Z., & Millar, R. (2004). Reasoning from data: How students collect and interpret data in science investigations. Journal of Research in Science Teaching, 41(7), 748–769.

  • Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86.

  • Klahr, D., & Dunbar, K. (1988). Dual space search during scientific reasoning. Cognitive Science, 12, 1–48.

  • Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15(10), 661–667.

  • Klayman, J., & Ha, Y. W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211–228.

  • Koedinger, K., & Corbett, A. (2006). Cognitive tutors: Technology bringing learning sciences to the classroom. In R. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 61–77). New York: Cambridge University Press.

  • Koedinger, K., Pavlik Jr, P. I., Stamper, J., Nixon, T., & Ritter, S. (2010). Avoiding problem selection thrashing with conjunctive knowledge tracing. In Educational data mining 2011.

  • Krajcik, J., Blumenfeld, P., Marx, R., Bass, K., Fredricks, J., & Soloway, E. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. The Journal of the Learning Sciences, 7, 313–350.

  • Krajcik, J., Marx, R., Blumenfeld, P., Soloway, E., & Fishman, B. (2000). Inquiry based science supported by technology: Achievement among urban middle school students. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans.

  • Kuhn, D. (1991). The skills of argument. Cambridge: Cambridge University Press.

  • Kuhn, D. (2005). Education for thinking. Cambridge, MA: Harvard University Press.

  • Kuhn, D., Schauble, L., & Garcia-Mila, M. (1992). Cross-domain development of scientific reasoning. Cognition and Instruction, 9(4), 285–327.

  • Kuhn, D., Garcia-Mila, M., Zohar, A., Andersen, C., White, S., Klahr, D., & Carver, S. (1995). Strategies of knowledge acquisition. Monographs of the Society for Research in Child Development, 60(4), 1–157.

  • Li, H., Gobert, J., & Dickler, R. (2017). Dusting off the messy middle: Assessing students’ inquiry skills through doing and writing. In E. André, R. Baker, X. Hu, M. Rodrigo, & B. du Boulay (Eds.), Lecture Notes in Computer Science (Vol. 10331, pp. 175–187). Cham: Springer.

  • McElhaney, K., & Linn, M. (2008). Impacts of students’ experimentation using a dynamic visualization on their understanding of motion. In Proceedings of the 8th international conference of the learning sciences (pp. 51–58). Netherlands: International Society of the Learning Sciences.

  • McElhaney, K., & Linn, M. (2010). Helping students make controlled experiments more informative. In K. Gomez, L. Lyons, & J. Radinsky (Eds.), Learning in the disciplines: Proceedings of the 9th international conference of the learning sciences (pp. 786–793). Chicago: International Society of the Learning Sciences.

  • McNeill, K. L., & Krajcik, J. (2007). Middle school students’ use of appropriate and inappropriate evidence in writing scientific explanations. In M. Lovett & P. Shah (Eds.), Thinking with data (pp. 233–265). New York: Taylor & Francis Group, LLC.

  • McNeill, K. L., & Krajcik, J. S. (2011). Supporting grade 5–8 students in constructing explanations in science: The claim, evidence, and reasoning framework for talk and writing. Upper Saddle River: Pearson.

  • McNeill, K. L., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students’ construction of scientific explanations by fading scaffolds in instructional materials. The Journal of the Learning Sciences, 15(2), 153–191.

  • Mislevy, R. J., Behrens, J. T., DiCerbo, K. E., & Levy, R. (2012). Design and discovery in educational assessment: Evidence centered design, psychometrics, and data mining. Journal of Educational Data Mining, 4(1), 11–48.

  • Moussavi, R., Kennedy, M., Sao Pedro, M. A., & Gobert, J. D. (2015). Evaluating a scaffolding design to support students’ data interpretation skills within a simulation-based inquiry environment. Presented at the meeting of the American Education Research Association, Chicago.

  • Moussavi, R., Sao Pedro, M., & Gobert, J. D. (2016a). Evaluating the efficacy of real-time scaffolding for data interpretation skills. Paper presented at the meeting of the American Education Research Association, Washington, DC.

  • Moussavi, R., Sao Pedro, M., & Gobert, J. D. (2016b). The effect of scaffolding on the immediate transfer of students’ data interpretation skills within science topics. Presented at the 12th International Conference of the Learning Sciences, Singapore.

  • National Research Council. (2011). Successful K-12 STEM education: Identifying effective approaches in science, technology, engineering, and mathematics. Washington, D.C.: National Academies Press.

  • Next Generation Science Standards Lead States. (2013). Next generation science standards: For states, by states. Washington, DC: The National Academies Press.

  • Njoo, M., & de Jong, T. (1993). Exploratory learning with a computer simulation for control theory: Learning processes and instructional support. Journal of Research in Science Teaching, 30, 821–844.

  • Organization for Economic Cooperation and Development. (2018). PISA 2015 results in focus: What 15-year-olds know and what they can do with what they know. Paris: Organization for Economic Cooperation and Development.

  • Pellegrino, J., Chudowsky, N., & Glaser, R. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.

  • Quinn, J., & Alessi, S. (1994). The effects of simulation complexity and hypothesis-generation strategy on learning. Journal of Research on Computing in Education, 27(1), 75–91.

  • Reimann, P. (1991). Detecting functional relations in a computerized discovery environment. Learning and Instruction, 1(1), 45–65.

  • Roll, I., Yee, N., & Briseno, A. (2014). Students’ adaptation and transfer of strategies across levels of scaffolding in an exploratory environment. In Intelligent tutoring systems (pp. 348–353). Switzerland: Springer International Publishing.

  • Sao Pedro, M. (2013). Real-time assessment, prediction, and scaffolding of middle school students’ data collection skills within physical science simulations (Doctoral dissertation). Worcester: Worcester Polytechnic Institute.

  • Sao Pedro, M. A., Baker, R. S., Montalvo, O., Nakama, A., & Gobert, J. D. (2010). Using text replay tagging to produce detectors of systematic experimentation behavior patterns. In R. Baker, A. Merceron, & P. Pavlik (Eds.), Proceedings of the 3rd international conference on educational data mining (pp. 181–190).

  • Sao Pedro, M., Baker, R., & Gobert, J. (2012a). Improving construct validity yields better models of systematic inquiry, even with less information. In Proceedings of the 20th conference on user modeling, adaptation, and personalization (pp. 249–260). Berlin: Springer.

  • Sao Pedro, M., Gobert, J., & Baker, R. (2012b). Assessing the learning and transfer of data collection inquiry skills using educational data mining on students’ log files. Paper presented at The Annual Meeting of the American Educational Research Association, Vancouver.

  • Sao Pedro, M., Baker, R., & Gobert, J. (2013a). Incorporating scaffolding and tutor context into Bayesian knowledge tracing to predict inquiry skill acquisition. In S. K. D’Mello, R. A. Calvo, & A. Olney (Eds.), Proceedings of the 6th international conference on educational data mining (pp. 185–192).

  • Sao Pedro, M., Baker, R., & Gobert, J. (2013b). What different kinds of stratification can reveal about the generalizability of data-mined skill assessment models. In Proceedings of the 3rd conference on learning analytics and knowledge.

  • Sao Pedro, M., Baker, R., Gobert, J., Montalvo, O., & Nakama, A. (2013c). Leveraging machine-learned detectors of systematic inquiry behavior to estimate and predict transfer of inquiry skill. User Modeling and User-Adapted Interaction, 23(1), 1–39.

  • Sao Pedro, M. A., Gobert, J. D., & Betts, C. G. (2014). Towards scalable assessment of performance-based skills: Generalizing a detector of systematic science inquiry to a simulation with a complex structure. In Intelligent tutoring systems (pp. 591–600). Switzerland: Springer International Publishing.

  • Sao Pedro, M., Gobert, J., Toto, E., & Paquette, L. (2015). Assessing transfer of students’ data analysis skills across physical science simulations. In I. Bejar (Chair), The state of the art in automated scoring of science inquiry tasks. Symposium conducted at the meeting of the American Education Research Association, Chicago.

  • Schauble, L. (1990). Belief revision in children: The role of prior knowledge and strategies for generating evidence. Journal of Experimental Child Psychology, 49, 31–57.

  • Schauble, L., Klopfer, L. E., & Raghavan, K. (1991). Students’ transition from an engineering model to a science model of experimentation. Journal of Research in Science Teaching, 28(9), 859–882.

  • Schauble, L., Glaser, R., Duschl, R. A., Schulze, S., & John, J. (1995). Students’ understanding of the objectives and procedures of experimentation in the science classroom. The Journal of the Learning Sciences, 4(2), 131–166.

  • Schneider, R., Krajcik, J., & Blumenfeld, P. (2005). Enacting reform-based science materials: The range of teacher enactments in reform classrooms. Journal of Research in Science Teaching, 42(3), 283–312.

  • Schunn, C. D., & Anderson, J. R. (1998). Scientific discovery. In J. R. Anderson (Ed.), The atomic components of thought (pp. 385–428). Mahwah: Lawrence Erlbaum Associates.

  • Schunn, C. D., & Anderson, J. R. (1999). The generality/specificity of expertise in scientific reasoning. Cognitive Science, 23(3), 337–370.

  • Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.

  • Shute, V., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive Learning Environments, 1, 55–71.

  • Staer, H., Goodrum, D., & Hackling, M. (1998). High school laboratory work in Western Australia: Openness to inquiry. Research in Science Education, 28(2), 219–228.

  • Toulmin, S. (1958). The uses of argument. New York: Cambridge University Press.

  • Tschirgi, J. E. (1980). Sensible reasoning: A hypothesis about hypotheses. Child Development, 51(1), 1–10.

  • van Joolingen, W. R., & de Jong, T. (1991a). Supporting hypothesis generation by learners exploring an interactive computer simulation. Instructional Science, 20(5), 389–404.

  • van Joolingen, W. R., & de Jong, T. (1991b). Characteristics of simulations for instructional settings. Education and Computing, 6(3-4), 241–262.

  • van Joolingen, W. R., & de Jong, T. (1993). Exploring a domain through a computer simulation: Traversing variable and relation space with the help of a hypothesis scratchpad. In D. Towne, T. de Jong, & H. Spada (Eds.), Simulation-based experiential learning (pp. 191–206). Berlin: Springer.

  • van Joolingen, W. R., & de Jong, T. (1997). An extended dual search space model of scientific discovery learning. Instructional Science, 25(5), 307–346.

  • Williamson, D., Mislevy, R., & Bejar, I. (2006). Automated scoring of complex tasks in computer-based testing. Mahwah: Lawrence Erlbaum Associates.

Author information

Corresponding author

Correspondence to Janice D. Gobert.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Gobert, J.D., Moussavi, R., Li, H., Sao Pedro, M., Dickler, R. (2018). Real-Time Scaffolding of Students’ Online Data Interpretation During Inquiry with Inq-ITS Using Educational Data Mining. In: Auer, M., Azad, A., Edwards, A., de Jong, T. (eds) Cyber-Physical Laboratories in Engineering and Science Education. Springer, Cham. https://doi.org/10.1007/978-3-319-76935-6_8

  • DOI: https://doi.org/10.1007/978-3-319-76935-6_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-76934-9

  • Online ISBN: 978-3-319-76935-6

  • eBook Packages: Education, Education (R0)
