Requirements for a Gesture Specification Language

A Comparison of Two Representation Formalisms
  • Alexis Heloir
  • Michael Kipp
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5934)

Abstract

We present a comparative study of two gesture specification languages. Our aim is to derive requirements for a new, optimal specification language that can be used to extend the emerging BML standard. We compare MURML, which was designed to specify coverbal gestures, and a language we call LV, originally designed to describe French Sign Language utterances. As a first step toward a new gesture specification language, we created EMBRScript, a low-level animation language capable of describing multi-channel animations that can serve as a foundation for future BML extensions.

Keywords

embodied conversational agents · gesture description language · comparative study


Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Alexis Heloir¹
  • Michael Kipp¹
  1. DFKI, Embodied Agents Research Group, Saarbrücken, Germany
