
Autonomous Robots, Volume 42, Issue 3, pp 569–580

Generic method for generating blended gestures and affective functional behaviors for social robots

  • Greet Van de Perre
  • Hoang-Long Cao
  • Albert De Beir
  • Pablo Gómez Esteban
  • Dirk Lefeber
  • Bram Vanderborght

Abstract

Gesturing is an important modality in human–robot interaction. To date, gestures are often implemented for a specific robot configuration and are therefore not easily transferable to other robots. To address this issue, we previously presented a generic method to calculate gestures for social robots. The method was designed to work in two modes to allow the calculation of different types of gestures. In this paper, we present new developments of the method. We discuss how the two working modes can be combined to generate blended emotional expressions and deictic gestures. In certain situations, it is desirable to express an emotional state through an ongoing functional behavior. We therefore implemented the possibility of modulating a pointing or reaching gesture into an affective gesture by influencing the motion speed and the amplitude of the posture. The new implementations were validated on virtual models with different configurations, including those of the robots NAO and Justin.
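The abstract describes turning a functional pointing or reaching gesture into an affective one by modulating its motion speed and amplitude. The sketch below (Python) is purely illustrative and is not the authors' implementation; the function and parameter names (`modulate_gesture`, `amplitude_scale`, `speed_scale`) and the trajectory format are assumptions, shown only to make the idea of scaling a joint-angle trajectory around a neutral posture and retiming it concrete.

```python
# Illustrative sketch only, not the method from the paper: modulate a
# functional gesture (a time series of joint angles) in amplitude and speed
# so that the same pointing/reaching motion can carry an affective quality.

import numpy as np

def modulate_gesture(trajectory, timestamps, neutral_posture,
                     amplitude_scale=1.0, speed_scale=1.0):
    """Scale a joint-angle trajectory around a neutral posture (amplitude)
    and compress or stretch its timing (speed).

    trajectory:      (T, n_joints) array of joint angles [rad]
    timestamps:      (T,) array of sample times [s]
    neutral_posture: (n_joints,) reference posture the excursion is scaled around
    """
    trajectory = np.asarray(trajectory, dtype=float)
    neutral = np.asarray(neutral_posture, dtype=float)

    # Amplitude: exaggerate (>1) or attenuate (<1) the excursion from neutral.
    modulated = neutral + amplitude_scale * (trajectory - neutral)

    # Speed: a higher speed_scale plays the same path back in less time.
    new_timestamps = np.asarray(timestamps, dtype=float) / speed_scale

    return modulated, new_timestamps

if __name__ == "__main__":
    # Toy example: a single-joint pointing motion from 0 to 1 rad over 2 s,
    # made slightly larger and faster, e.g. to suggest higher arousal.
    t = np.linspace(0.0, 2.0, 50)
    traj = 0.5 * (1 - np.cos(np.pi * t / 2.0))[:, None]  # smooth 0 -> 1 rad
    mod_traj, mod_t = modulate_gesture(traj, t, neutral_posture=[0.0],
                                       amplitude_scale=1.2, speed_scale=1.5)
    print("peak angle: %.2f rad over %.2f s" % (mod_traj.max(), mod_t[-1]))
```

In this toy setting, an amplitude scale above one and a speed scale above one yield a larger, quicker motion, the kind of modulation the abstract associates with expressing an emotional state through an ongoing functional behavior; how the scales map to emotions is outside the scope of this sketch.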

Keywords

Generic gesture system · Pointing · Gestures · Upper body postures · Affective gesture

Notes

Acknowledgements

The first author is funded by the Fund for Scientific Research (FWO) Flanders. This work is partially funded by the EU project DREAM (611391). The Robotics and Multibody Mechanics Research Group is a partner of the Agile and Human Centered Production and Robotic Systems Research Priority of Flanders Make. The authors would like to thank DLR for sharing the virtual model of Justin.

Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Robotics and Multibody Mechanics Research Group, Vrije Universiteit Brussel, Brussels, Belgium
  2. Flanders Make, Brussels, Belgium