Looking Beyond Transfer Models: Finding Other Sources of Power for Student Models

  • Conference paper
User Modeling, Adaption and Personalization (UMAP 2011)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 6787)

Abstract

Student modeling plays an important role in educational research, and many techniques have been developed to accurately estimate student performance. In this paper, using Performance Factors Analysis as our framework, we examine which components of the model enable us to better predict, and consequently better understand, student performance. Using transfer models for prediction is common across student modeling techniques, since student proficiency on the required skills is believed to largely determine performance. However, we found that problem difficulty is an even more important predictor than student knowledge of the required skills. In addition, we found that using student proficiencies across all skills works better than using only the skills deemed relevant by the transfer model. We tested our proposed models with two transfer models of fine and coarse grain size; the results suggest that the improvement is not simply an illusion caused by possible mistakes in associating skills with problems.
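Performance Factors Analysis (PFA) is a logistic regression model: the probability that a student answers an item correctly is a logistic function of per-skill easiness parameters plus weighted counts of the student's prior successes and failures on the skills that the transfer model (Q-matrix) assigns to the item. The Python sketch below illustrates that prediction step, together with one hypothetical way the variants described in the abstract could be expressed, using an item-difficulty term and proficiency terms summed over all skills rather than only the mapped ones. All function and parameter names here are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def sigmoid(x):
        # Logistic link used by PFA to map a linear score to a probability.
        return 1.0 / (1.0 + np.exp(-x))

    def pfa_predict(q_row, beta, gamma, rho, successes, failures):
        # Standard PFA: sum easiness and practice effects over the skills
        # that the transfer model (Q-matrix row q_row, 0/1) maps to this item.
        #   beta      - per-skill easiness
        #   gamma/rho - per-skill weights on prior successes / failures
        #   successes/failures - this student's prior practice counts per skill
        logit = np.sum(q_row * (beta + gamma * successes + rho * failures))
        return sigmoid(logit)

    def variant_predict(delta_item, beta, gamma, rho, successes, failures):
        # Hypothetical variant in the spirit of the abstract: a per-item
        # difficulty (easiness) parameter delta_item is added, and the
        # student-proficiency terms are summed over all skills instead of
        # only those the transfer model requires.
        logit = delta_item + np.sum(beta + gamma * successes + rho * failures)
        return sigmoid(logit)

    # Toy example with three skills; the numbers are made up for illustration.
    q_row = np.array([1, 0, 1])
    beta = np.array([0.2, -0.1, 0.4])
    gamma = np.array([0.3, 0.2, 0.1])
    rho = np.array([-0.2, -0.1, -0.3])
    successes = np.array([2, 5, 1])
    failures = np.array([1, 0, 2])

    print(pfa_predict(q_row, beta, gamma, rho, successes, failures))
    print(variant_predict(0.5, beta, gamma, rho, successes, failures))

In practice the parameters would be fit by maximizing the likelihood of observed student responses, for example with a standard logistic regression package; the toy values above only show how the two scoring rules differ.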




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Gong, Y., Beck, J.E. (2011). Looking Beyond Transfer Models: Finding Other Sources of Power for Student Models. In: Konstan, J.A., Conejo, R., Marzo, J.L., Oliver, N. (eds) User Modeling, Adaption and Personalization. UMAP 2011. Lecture Notes in Computer Science, vol 6787. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22362-4_12


  • DOI: https://doi.org/10.1007/978-3-642-22362-4_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-22361-7

  • Online ISBN: 978-3-642-22362-4

  • eBook Packages: Computer Science, Computer Science (R0)
