A Proposed Dynamical Analytic Method for Characteristic Gestures in Human Communication

  • Toshiya Naka
  • Toru Ishida
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9176)

Abstract

In human communication, nonverbal information such as gestures and facial expressions often plays a greater role than language, and gesture-driven operations on recent mobile devices have proved to be intuitive, easy-to-use interfaces. In this paper we propose a method for analyzing gestures in human communication based on a dynamical kinematic model. We extend our previously proposed analysis method to account for additional effects, such as external forces, and analyze the whole-body effects of the forces generated by gestures. We found that the degree of exaggeration can be quantified by the magnitude of, and changes in, joint torque. Moreover, when the torques are calculated with external forces and the drag moment acting on both feet taken into account, the twisting torque of the main joints can be determined with high precision. We also observed “preparation” and “follow-through” motions just before and after an emphasized motion, and found that each behavior can be quantified by the “undershoot” or “overshoot” in the torque change.
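To make the underlying idea concrete: the paper quantifies gestures through joint torques computed from a dynamical model, with the full method also incorporating external forces and the drag moment at the feet. The sketch below is a minimal illustration only, not the paper's implementation. It uses a planar two-link chain with hypothetical segment parameters; the names `joint_torques`, `torque_profile`, and `under_overshoot` are illustrative and assumed, not from the paper. It computes inverse-dynamics torques along a sampled joint trajectory and measures the undershoot ("preparation") and overshoot ("follow-through") of a torque trace around its main peak.

```python
import numpy as np

# Hypothetical parameters for a planar two-link limb (illustrative values,
# not data from the paper).
M1, M2 = 2.0, 1.5        # segment masses [kg]
L1, L2 = 0.30, 0.28      # segment lengths [m]
LC1, LC2 = 0.15, 0.14    # distances from joints to segment COMs [m]
I1, I2 = 0.02, 0.015     # moments of inertia about each COM [kg m^2]
G = 9.81                 # gravitational acceleration [m/s^2]


def joint_torques(q, dq, ddq):
    """Inverse dynamics of a planar 2-link chain: tau = M(q)ddq + C(q,dq)dq + g(q)."""
    q1, q2 = q
    c2 = np.cos(q2)
    # Mass (inertia) matrix
    m11 = I1 + I2 + M1 * LC1**2 + M2 * (L1**2 + LC2**2 + 2 * L1 * LC2 * c2)
    m12 = I2 + M2 * (LC2**2 + L1 * LC2 * c2)
    m22 = I2 + M2 * LC2**2
    Mq = np.array([[m11, m12], [m12, m22]])
    # Coriolis/centrifugal terms
    h = -M2 * L1 * LC2 * np.sin(q2)
    Cq = np.array([[h * dq[1], h * (dq[0] + dq[1])],
                   [-h * dq[0], 0.0]])
    # Gravity vector
    gq = np.array([(M1 * LC1 + M2 * L1) * G * np.cos(q1)
                   + M2 * LC2 * G * np.cos(q1 + q2),
                   M2 * LC2 * G * np.cos(q1 + q2)])
    return Mq @ ddq + Cq @ dq + gq


def torque_profile(q_traj, dt):
    """Torques along a sampled joint trajectory (finite-difference derivatives)."""
    dq = np.gradient(q_traj, dt, axis=0)
    ddq = np.gradient(dq, dt, axis=0)
    return np.array([joint_torques(q, v, a)
                     for q, v, a in zip(q_traj, dq, ddq)])


def under_overshoot(tau, baseline):
    """Undershoot: dip below baseline before the main peak ('preparation').
    Overshoot: rise above baseline after the peak ('follow-through')."""
    peak = int(np.argmax(np.abs(tau - baseline)))
    undershoot = baseline - tau[:peak + 1].min() if peak > 0 else 0.0
    overshoot = tau[peak:].max() - baseline
    return undershoot, overshoot


# Example: an exaggerated elbow flick superimposed on a slow raise (synthetic).
t = np.linspace(0.0, 1.0, 200)
q_traj = np.stack([0.3 * t,
                   0.5 * t + 0.4 * np.exp(-((t - 0.5) / 0.07)**2)], axis=1)
tau = torque_profile(q_traj, dt=t[1] - t[0])
print(under_overshoot(tau[:, 1], baseline=tau[0, 1]))
```

In this toy setting, a sharp emphasized stroke in joint angle produces the characteristic torque dip before and peak after the motion, which is the kind of undershoot/overshoot signature the abstract describes. The paper's actual analysis additionally accounts for external forces and the foot drag moment, which this sketch omits.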

Keywords

Nonverbal communication · Gesture · Virtual reality · Dynamics


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Kyoto University, Kyoto, Japan
  2. Panasonic Advanced Research Lab, Kyoto, Japan
