Speech, Gaze and Head Motion in a Face-to-Face Collaborative Task

  • Sascha Fagel
  • Gérard Bailly
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6456)


In the present work we observe two subjects interacting in a collaborative task in a shared environment. One goal of the experiment is to measure how gaze behavior changes when one interactant wears dark glasses, so that his or her gaze is not visible to the other. The results show that if one subject wears dark glasses while telling the other subject the position of a certain object, the other subject needs significantly more time to locate and move that object. Hence, the gaze of one subject looking at a certain object, when visible, speeds up the localization of that object by the other subject. The second goal of the currently ongoing work is to collect data on the multimodal behavior of one of the subjects, by means of audio recording and eye gaze and head motion tracking, in order to build a model that can be used to control a robot in a comparable scenario in future experiments.


Keywords: eye gaze, eye tracking, collaborative task, face-to-face interaction, head motion capture





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Sascha Fagel, GIPSA-lab, Grenoble, France
  • Gérard Bailly, GIPSA-lab, Grenoble, France
