
Facial emotion recognition with transition detection for students with high-functioning autism in adaptive e-learning

  • Methodologies and Application
  • Published in Soft Computing

Abstract

Emotions deeply affect learning achievement. For students with high-functioning autism (HFA), negative emotions such as anxiety and anger can impair the learning process because these individuals have difficulty controlling their emotions. Attempting to regulate negative emotions in HFA students only after they have occurred is often ineffective, since it is difficult to calm the students down. Hence, detecting emotional transitions and providing adaptive emotion regulation strategies in a timely manner can be especially important for students with HFA in an e-learning environment. In this study, a facial expression-based emotion recognition method with transition detection is proposed. An emotion elicitation experiment was performed to collect facial landmark signals for building emotion recognition classifiers. The proposed method uses a sliding window technique and support vector machines (SVM) to build the classifiers. To determine robust features for emotion recognition, Information Gain (IG) and Chi-square were used for feature evaluation. The effectiveness of classifiers with different sliding window parameters was also examined. The experimental results confirmed that the proposed method has sufficient discriminatory capability: the recognition rates for basic and transitional emotions were 99.13% and 92.40%, respectively. With feature selection, training was accelerated by a factor of 4.45, while the recognition rates for basic and transitional emotions were 97.97% and 87.49%, respectively. The method was applied in an adaptive e-learning environment for mathematics to demonstrate its practical effectiveness.
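
The abstract describes a pipeline of sliding windows over facial-landmark signals, feature scoring with Information Gain and Chi-square, and SVM classification. The sketch below is a minimal illustration of that kind of pipeline, not the authors' implementation: the window length, step size, per-window statistics, number of selected features, and all data shapes are hypothetical, and scikit-learn's mutual_info_classif stands in for an Information Gain ranking.

```python
# Minimal sketch (assumptions only): sliding windows over landmark signals,
# Chi-square / mutual-information feature scoring, and an SVM classifier.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def sliding_windows(signal, window_size=30, step=10):
    """Cut a (frames x landmarks) signal into overlapping windows and
    summarize each window with simple per-landmark statistics."""
    feats = []
    for start in range(0, len(signal) - window_size + 1, step):
        win = signal[start:start + window_size]
        feats.append(np.concatenate([win.mean(axis=0),
                                     win.std(axis=0),
                                     win.max(axis=0) - win.min(axis=0)]))
    return np.asarray(feats)

# Hypothetical data: 200 recordings, 90 frames each, 68 landmark distances.
rng = np.random.default_rng(0)
recordings = rng.random((200, 90, 68))
labels = rng.integers(0, 4, size=200)          # e.g., four basic emotions

X = np.vstack([sliding_windows(r).mean(axis=0) for r in recordings])
y = labels

# An information-gain-style ranking via mutual information (illustrative).
ig_scores = mutual_info_classif(X, y, random_state=0)
print("Top IG-ranked features:", np.argsort(ig_scores)[::-1][:5])

# Chi-square requires non-negative inputs, hence the min-max scaling.
model = Pipeline([
    ("scale", MinMaxScaler()),
    ("select", SelectKBest(chi2, k=50)),
    ("svm", SVC(kernel="rbf", C=1.0)),
])
model.fit(X, y)
print("Training accuracy:", model.score(X, y))
```

In practice the windows would be computed from tracked landmark coordinates and labeled per segment, and the window size, step, and number of selected features would be tuned, as the paper's experiments on sliding window parameters suggest.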





Author information


Corresponding author

Correspondence to Yuh-Min Chen.

Ethics declarations

Conflict of Interest

Hui-Chan Chu declares that she has no conflict of interest. William Wei-Jen Tsai declares that he has no conflict of interest. Min-Ju Liao declares that she has no conflict of interest. Yuh-Min Chen declares that he has no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. This article does not contain any studies with animals performed by any of the authors. Informed consent was obtained from all individual participants included in the study.

Additional information

Communicated by V. Loia.


About this article


Cite this article

Chu, H.-C., Tsai, W.W.-J., Liao, M.-J. et al. Facial emotion recognition with transition detection for students with high-functioning autism in adaptive e-learning. Soft Comput 22, 2973–2999 (2018). https://doi.org/10.1007/s00500-017-2549-z

