Detecting Happiness in Human Face Using Minimal Feature Vectors
Emotions estimated from the human face are more effective than other modes of emotion extraction owing to their robustness, high accuracy, and efficiency. This paper proposes detecting happiness in the human face using minimal facial features derived from a geometric deformable model and a supervised classifier. First, face detection and tracking are performed with a constrained local model (CLM). Using the CLM grid nodes, both the full and the minimal feature-vector displacements are obtained through facial feature extraction. The minimal feature vectors, rather than the full feature set, are used for detecting happiness in order to improve accuracy. Facial animation parameters (FAPs) help identify the facial feature movements that form the feature-vector displacements. These displacements are fed to a supervised bilinear support vector machine (SVM) classifier to detect happiness in frontal-face image sequences. This paper focuses on minimal feature vectors of happiness (frontal face) in both the training and testing phases. The MMI facial expression database is used for training, and real-time data are used for testing. As a result, an overall happiness detection accuracy of 91.66% is achieved using minimal feature vectors.
Keywords: Constrained local model (CLM) · Facial animation parameters (FAPs) · Minimal feature vectors displacement · Support vector machines (SVMs)
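The final classification step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the displacement values are synthetic, the "minimal" feature set is a hypothetical handful of FAP-style lip-corner deltas, and scikit-learn's `SVC` with a linear kernel stands in for the paper's bilinear SVM.

```python
# Sketch: classify happiness from feature-vector displacements with an SVM.
# All data below are synthetic illustrations, not values from the paper.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical minimal feature set: displacements of a few mouth-region
# landmarks (e.g. lip-corner x/y shifts) between a neutral and an
# expressive frame, expressed as FAP-style deltas.
n_features = 6

# Synthetic training data: "happy" faces show large lip-corner
# displacement; "neutral" faces show near-zero displacement.
happy = rng.normal(loc=2.0, scale=0.5, size=(40, n_features))
neutral = rng.normal(loc=0.0, scale=0.5, size=(40, n_features))
X = np.vstack([happy, neutral])
y = np.array([1] * 40 + [0] * 40)  # 1 = happiness, 0 = neutral

# Train a supervised SVM on the displacement vectors.
clf = SVC(kernel="linear").fit(X, y)

# Classify a new displacement vector with strong lip-corner movement.
probe = np.full((1, n_features), 2.1)
print("happy" if clf.predict(probe)[0] == 1 else "neutral")
```

Because the two synthetic classes are well separated, the probe vector with large displacements is labeled as happiness; in the paper's pipeline the inputs would instead be CLM-tracked landmark displacements from image sequences.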
The authors would like to thank their research colleagues at Vellore Institute of Technology, Chennai, for providing the real-time dataset.