Looking Beyond Transfer Models: Finding Other Sources of Power for Student Models

  • Yue Gong
  • Joseph E. Beck
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6787)


Student modeling plays an important role in educational research. Many techniques have been developed that focus on accurately estimating student performance. In this paper, using Performance Factors Analysis as our framework, we examine which components of the model enable us to better predict, and consequently better understand, student performance. Predicting with transfer models is common across student modeling techniques, as student proficiencies on the required skills are believed to largely determine performance. However, we found that problem difficulty is an even more important predictor than student knowledge of the required skills. In addition, we found that using student proficiencies across all skills works better than using only the skills the transfer model deems relevant. We tested our proposed models with two transfer models of fine and coarse grain size; the results suggest that the improvement is not simply an illusion caused by possible mistakes in associating skills with problems.


Keywords: performance factors analysis; question difficulty; student overall proficiencies; predicting student performance
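For readers unfamiliar with the framework the abstract names, the standard Performance Factors Analysis prediction is a logistic model whose logit sums, over the skills a transfer model assigns to a problem, a skill easiness term plus weighted counts of the student's prior successes and failures on that skill. The sketch below is illustrative, not the authors' implementation; the parameter values and skill names are hypothetical.

```python
import math

def pfa_probability(skills, successes, failures, beta, gamma, rho):
    """PFA: P(correct) = logistic(sum over required skills j of
    beta_j + gamma_j * successes_j + rho_j * failures_j)."""
    logit = sum(beta[j] + gamma[j] * successes.get(j, 0) + rho[j] * failures.get(j, 0)
                for j in skills)
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical fitted parameters for two skills.
beta  = {"fractions": -0.5, "decimals": -0.2}   # skill easiness
gamma = {"fractions": 0.15, "decimals": 0.10}   # credit per prior success
rho   = {"fractions": 0.05, "decimals": 0.02}   # credit per prior failure

# A student with 4 successes and 2 failures on "fractions":
p = pfa_probability(["fractions"], {"fractions": 4}, {"fractions": 2},
                    beta, gamma, rho)
```

The paper's point can be read against this form: the logit above depends only on the skills the transfer model links to the problem, whereas the authors find that adding a problem-difficulty term, and proficiency evidence from all skills, predicts better.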





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Yue Gong (1)
  • Joseph E. Beck (1)
  1. Computer Science Department, Worcester Polytechnic Institute, Worcester, USA
