Action Unit-Based Linked Data for Facial Emotion Recognition

  • Kosuke Kaneko
  • Yoshihiro Okada
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8210)

Abstract

This paper presents a methodology for building linked data from the relationships between facial action units and their states, used as emotional parameters for facial emotion recognition. The authors focus in particular on building action unit-based linked data because such data can not only be used directly for facial emotion recognition but can also become more useful when merged with other linked data. Although a linked-data representation might be expected to yield lower recognition accuracy than other approaches, in practice the proposed method achieves almost the same accuracy as approaches based on Artificial Neural Networks and Support Vector Machines.
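The core idea above can be sketched in a few lines: action unit-to-emotion relationships are stored as subject-predicate-object triples (the basic unit of linked data), and a detected set of action units is classified by best overlap. Note the AU combinations below are illustrative EMFACS-style mappings chosen for this sketch, not the actual linked data built in the paper, and `hasActionUnit` is a hypothetical predicate name.

```python
# Illustrative action-unit-to-emotion mappings (EMFACS-style; assumed
# for this sketch, not taken from the paper's dataset).
EMOTION_AUS = {
    "Happiness": {"AU6", "AU12"},
    "Sadness":   {"AU1", "AU4", "AU15"},
    "Surprise":  {"AU1", "AU2", "AU5", "AU26"},
    "Anger":     {"AU4", "AU5", "AU7", "AU23"},
}

# Flatten the mappings into triples, the basic unit of linked data,
# so they could later be merged with other triple stores.
TRIPLES = [(emotion, "hasActionUnit", au)
           for emotion, aus in EMOTION_AUS.items()
           for au in sorted(aus)]

def classify(detected_aus: set) -> str:
    """Return the emotion whose linked action units best match the
    detected set, scored by Jaccard similarity."""
    def score(emotion: str) -> float:
        aus = EMOTION_AUS[emotion]
        return len(aus & detected_aus) / len(aus | detected_aus)
    return max(EMOTION_AUS, key=score)

print(classify({"AU6", "AU12"}))        # prints "Happiness"
print(classify({"AU1", "AU2", "AU5"}))  # prints "Surprise" (closest overlap)
```

A real implementation would represent the triples in RDF and query them with SPARQL, but the overlap-scoring step shown here conveys how a triple-based representation can still drive classification.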

Keywords

Linked Data · Semantic Data · Facial Emotion Recognition

Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Kosuke Kaneko (1)
  • Yoshihiro Okada (1)

  1. Innovation Center for Educational Resource, Kyushu University Library, Fukuoka, Japan