
Introducing a Language for Human Stance Description

  • António C. Mabiala
  • Antonio W. Sousa
  • Norton T. Roman (corresponding author)
  • João L. Bernardes Jr.
  • Marcelo M. Antunes
  • Enrique M. Ortega
  • Luciano A. Digiampietri
  • Luis M. del Val Cura
  • Valdinei F. da Silva
  • Clodoaldo A. M. Lima
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9745)

Abstract

The ability to automatically determine a human body’s sequence of postures during movement has many practical applications, from evaluating the performance of physical activity practitioners to evaluating and designing the user experience of certain systems. Current representations for such postures, however useful, cannot capture all the features needed for a complete description of a human stance, such as the relationship between non-directly connected body parts. In this article, we introduce a markup language for body stance and movement description, designed to allow for the unambiguous representation of movement as well as the extraction of relationships between both directly and non-directly connected body parts. Along with the language, we also present a computer program developed to help end users codify stance and movement without having to know the language in detail.
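To make the idea concrete, the sketch below builds a small XML stance document of the general kind the abstract describes: joint-based descriptions of directly connected body parts plus an explicit relation between non-adjacent parts. The element and attribute names here (`stance`, `segment`, `relation`, and the joint-angle attributes) are illustrative assumptions, not the actual syntax defined in the paper.

```python
# Hypothetical sketch of a stance-description document. All tag and
# attribute names are assumptions for illustration; they are not the
# language introduced in the paper.
import xml.etree.ElementTree as ET

def build_stance():
    stance = ET.Element("stance", name="guard")
    # Directly connected body parts, described by joint angles
    # (degrees), as classical joint-based notations already allow.
    ET.SubElement(stance, "segment", part="upper_arm_right",
                  joint="shoulder_right", flexion="45", abduction="10")
    ET.SubElement(stance, "segment", part="forearm_right",
                  joint="elbow_right", flexion="90")
    # A relationship between non-directly connected parts -- the kind
    # of constraint joint angles alone cannot express unambiguously.
    ET.SubElement(stance, "relation",
                  {"from": "hand_right", "to": "knee_left",
                   "kind": "above", "distance": "0.3m"})
    return stance

doc = ET.tostring(build_stance(), encoding="unicode")
print(doc)
```

A tool like the end-user program mentioned in the abstract would generate such documents from a graphical interface, so users never write the markup by hand.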

Keywords

Body part · Human posture · Facial animation · Connected part · Movement description
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • António C. Mabiala (1)
  • Antonio W. Sousa (2)
  • Norton T. Roman (1, corresponding author)
  • João L. Bernardes Jr. (1)
  • Marcelo M. Antunes (3)
  • Enrique M. Ortega (3)
  • Luciano A. Digiampietri (1)
  • Luis M. del Val Cura (4)
  • Valdinei F. da Silva (1)
  • Clodoaldo A. M. Lima (1)
  1. University of São Paulo, São Paulo, Brazil
  2. São Paulo Faculty of Technology, São Paulo, Brazil
  3. Central Kung Fu Academy, Campinas, Brazil
  4. Campo Limpo Paulista Faculty, Campo Limpo Paulista, Brazil
