Analysis and Prediction of Student Emotions While Doing Programming Exercises

  • Thomas James Tiam-Lee
  • Kaoru Sumi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11528)


The modeling of student emotions has recently attracted considerable interest in the field of intelligent tutoring systems. However, most approaches are applied to typical interaction models characterized by frequent communication or dialogue between the student and the tutoring system. In this paper, we analyze emotions while students are writing computer programs without any human or agent communication to induce displays of affect. We use a combination of features derived from typing logs, compilation logs, and video of the students' faces while they solve coding exercises, and determine how these features can be used to predict affect. We find that combining pose-based, face-based, and log-based features can train models that predict affect with accuracy well above chance levels, and that certain features are discriminative for this task.


Keywords: Student modeling · Affective computing · Programming



The authors would like to thank Mr. Fritz Flores, Mr. Manuel Toleran, and Mr. Kayle Tiu for assisting in the facilitation of the data collection process in the Philippines.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Future University Hakodate, Hakodate, Japan
