Augmenting teacher-student interaction in digital learning through affective computing

Published in Multimedia Tools and Applications

Abstract

Interactions between teachers and students can be enhanced if teachers can capture their students' spontaneous nonverbal behaviors (e.g., facial expressions and body language) in real time, thereby improving both teaching strategies and student learning effectiveness. In this study, we implemented an expression–response analysis system (ERAS) to analyze facial expressions. The ERAS employs a web camera to capture students' facial images; their facial expressions are analyzed to assess their attitudes toward progressively more difficult course content and to determine the relationship between their social interactions and learning effectiveness. The ERAS identifies 10 facial feature points that form 11 facial action units (AUs). The AUs were then classified as positive, neutral, or negative social interactions by a rule-based expert system, and cognitive load theory was applied to verify the classifications. The experimental results showed that students with high coding abilities could adapt to the multimedia digital learning content, as evidenced by their comparatively higher frequency of neutral and positive social interactions, whereas students with low coding abilities exhibited a higher frequency of negative social interactions resulting from increased cognitive load. Moreover, real-time detection of social interactions can provide a basis for diagnosing students' learning difficulties and assist teachers in adjusting their teaching strategies.
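To make the pipeline concrete, the following is a minimal sketch of the kind of rule-based mapping from detected AUs to social-interaction labels that the abstract describes. It is illustrative only: the paper's exact 10 feature points, 11 AUs, and expert rules are not reproduced on this page, so the AU sets and the helper functions below (POSITIVE_AUS, NEGATIVE_AUS, classify_interaction, detect_faces) are assumptions, loosely following Ekman and Friesen's Facial Action Coding System and a stock OpenCV face detector in place of the authors' own feature-point tracker.

```python
import cv2  # OpenCV, used here only for webcam-style face detection

# Hypothetical AU groupings (NOT the paper's 11 AUs), loosely following
# Ekman and Friesen's Facial Action Coding System.
POSITIVE_AUS = {6, 12}      # assumed: AU6 (cheek raiser), AU12 (lip corner puller)
NEGATIVE_AUS = {4, 15, 17}  # assumed: AU4 (brow lowerer), AU15 (lip corner depressor)

def classify_interaction(active_aus):
    """Map the set of AUs detected in a frame to a social-interaction label."""
    pos = len(active_aus & POSITIVE_AUS)
    neg = len(active_aus & NEGATIVE_AUS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def detect_faces(frame):
    """Locate faces in a captured frame with a stock OpenCV Haar cascade."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Example: a frame in which AU4 and AU15 fire is labelled a negative
# social interaction, which the paper associates with rising cognitive load.
print(classify_interaction({4, 15}))  # -> negative
```

In the full system, an AU extractor would sit between face detection and classification, converting the 10 tracked feature points into the set of active AUs per frame; labels aggregated over time would then signal to the teacher when a student's negative interactions are accumulating.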





Author information

Correspondence to Kuan-Cheng Lin.


About this article


Cite this article

Hung, J.C.S., Chiang, K.H., Huang, Y.H. et al. Augmenting teacher-student interaction in digital learning through affective computing. Multimed Tools Appl 76, 18361–18386 (2017). https://doi.org/10.1007/s11042-016-4101-z

