Using Facial Emotional Signals for Communication between Emotionally Expressive Avatars in Virtual Worlds

  • Yuqiong Wang
  • Joe Geigel
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6975)


In this paper we explore the application of facial expression analysis and eye tracking to driving emotionally expressive avatars. We propose a system that transfers facial emotional signals, including facial expressions and eye movements, from the real world into a virtual world. The proposed system enables us to address two questions: How significant are eye movements in emotion expression? Can facial emotional signals be transferred effectively from the real world into virtual worlds? We design an experiment to address these questions. Our work makes two major contributions: 1) we propose a system that incorporates eye movements when transferring facial emotions; 2) we design an experiment to evaluate the effectiveness of the transferred facial emotional signals.


Keywords: facial expression analysis · facial emotional signal · eye gaze · avatar expression · virtual world





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Yuqiong Wang (1)
  • Joe Geigel (1)
  1. Rochester Institute of Technology, Rochester, USA
