
RETRACTED ARTICLE: Real-time facial expression recognition for affect identification using multi-dimensional SVM

  • Original Research
  • Published:
Journal of Ambient Intelligence and Humanized Computing

This article was retracted on 01 June 2022


Abstract

Automated detection of human affect states from facial expressions has attracted increasing research attention because of the demand for high accuracy. Most existing work trains affect-state detectors on standard facial-expression databases; comparatively little uses data collected from real-time experiments. To automate affect recognition efficiently, this work proposes a new classifier framework together with an experimentally acquired real-time dataset for training and testing. Participants' face images are captured and a Region of Interest (ROI) is processed to locate feature points, from which feature vectors are computed. These facial feature vectors are fed to the proposed multiclass classifier, a Multi-dimensional Support Vector Machine (MDSVM), which identifies and classifies the different affect states. The MDSVM's multi-dimensional design follows the two-dimensional valence-arousal emotional model: the level-1 SVM splits the input into two classes by valence, and the level-2 SVM splits it into two classes by arousal. Efficiency is further improved with 8-fold cross-validation. The proposed methodology thus combines a reliable experimentally acquired dataset, 12 significant feature vectors derived from facial expressions, the multi-dimensional SVM, and 8-fold cross-validation. Experimental results demonstrate that the system accurately classifies affect into four categories: Relax, Sad, Angry, and Happy. The average accuracy is 94.25% without cross-validation and 95.88% with 8-fold cross-validation.
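The two-level MDSVM described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic 12-dimensional features, the labeling rule, and the quadrant mapping from (valence, arousal) to the four affect states are assumptions made for the sketch, using scikit-learn's standard `SVC` in place of the paper's custom classifier.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the 12-dimensional facial feature vectors.
X = rng.normal(size=(200, 12))
valence = (X[:, 0] > 0).astype(int)  # level 1: positive vs. negative valence
arousal = (X[:, 1] > 0).astype(int)  # level 2: high vs. low arousal

# Level-1 and level-2 SVMs, trained independently on the same features.
svm_valence = SVC(kernel="rbf").fit(X, valence)
svm_arousal = SVC(kernel="rbf").fit(X, arousal)

# Assumed quadrant mapping from the valence-arousal plane to the four states.
QUADRANT = {(1, 1): "Happy", (1, 0): "Relax", (0, 1): "Angry", (0, 0): "Sad"}

def classify(x):
    """Route a feature vector through both SVM levels to an affect label."""
    v = int(svm_valence.predict(x.reshape(1, -1))[0])
    a = int(svm_arousal.predict(x.reshape(1, -1))[0])
    return QUADRANT[(v, a)]

# 8-fold cross-validation of each level, mirroring the paper's evaluation.
acc_v = cross_val_score(SVC(kernel="rbf"), X, valence, cv=8).mean()
acc_a = cross_val_score(SVC(kernel="rbf"), X, arousal, cv=8).mean()
print(classify(X[0]), round(acc_v, 2), round(acc_a, 2))
```

On real data each level would be trained on the experimentally acquired feature vectors; cascading two binary SVMs this way yields the four-class decision without a monolithic multiclass model.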



Author information


Corresponding author

Correspondence to W. Thamba Meshach.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article has been retracted. Please see the retraction notice for more detail: https://doi.org/10.1007/s12652-022-04015-4

About this article


Cite this article

Meshach, W.T., Hemajothi, S. & Anita, E.A.M. RETRACTED ARTICLE: Real-time facial expression recognition for affect identification using multi-dimensional SVM. J Ambient Intell Human Comput 12, 6355–6365 (2021). https://doi.org/10.1007/s12652-020-02221-6

