
Generic method for generating blended gestures and affective functional behaviors for social robots

Autonomous Robots

Abstract

Gesturing is an important modality in human–robot interaction. To date, gestures have often been implemented for a specific robot configuration and are therefore not easily transferable to other robots. To cope with this issue, we previously presented a generic method to calculate gestures for social robots. The method was designed to work in two modes to allow the calculation of different types of gestures. In this paper, we present new developments of the method. We discuss how the two working modes can be combined to generate blended emotional expressions and deictic gestures. In certain situations, it is desirable to express an emotional state through an ongoing functional behavior. We therefore implemented the possibility of modulating a pointing or reaching gesture into an affective gesture by influencing the motion speed and the amplitude of the posture. The new implementations were validated on virtual models with different configurations, including those of the robots NAO and Justin.
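To illustrate the idea of turning a functional gesture into an affective one by modulating speed and amplitude, the following is a minimal sketch in Python. It is not the authors' implementation; the joint values, parameter ranges, and the minimum-jerk profile are illustrative assumptions only.

```python
# Sketch: modulating a reaching/pointing trajectory's amplitude and playback
# speed to convey affect. Joint postures and parameter values are hypothetical.
import numpy as np


def minimum_jerk(q_start, q_end, duration, dt=0.01):
    """Joint-space minimum-jerk interpolation between two postures (radians)."""
    t = np.arange(0.0, duration + dt, dt)
    tau = t / duration
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return t, q_start + np.outer(s, (q_end - q_start))


def modulate_affect(q_start, q_end, duration, amplitude=1.0, speed=1.0):
    """Scale the posture excursion (amplitude) and the tempo (speed) of a gesture.

    amplitude > 1 exaggerates the movement (e.g. high arousal),
    amplitude < 1 attenuates it; speed > 1 shortens the execution time.
    """
    q_end_mod = q_start + amplitude * (q_end - q_start)
    return minimum_jerk(q_start, q_end_mod, duration / speed)


if __name__ == "__main__":
    # Two illustrative arm postures (shoulder pitch, shoulder roll, elbow) in rad.
    rest = np.array([1.4, 0.2, -0.5])
    point = np.array([0.0, 0.3, -0.1])

    # Neutral pointing gesture vs. an "excited" variant: larger and faster.
    t_neutral, q_neutral = modulate_affect(rest, point, duration=1.5)
    t_excited, q_excited = modulate_affect(rest, point, duration=1.5,
                                           amplitude=1.2, speed=1.4)
    print(f"neutral: {len(t_neutral)} samples over {t_neutral[-1]:.2f} s")
    print(f"excited: {len(t_excited)} samples over {t_excited[-1]:.2f} s")
```

In this sketch the end posture of the pointing movement is scaled about the start posture and the trajectory is replayed at a different tempo, which is one simple way to realize the speed and amplitude modulation described above.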




Acknowledgements

The first author is funded by the Fund for Scientific Research (FWO) Flanders. This work is partially funded by the EU project DREAM (611391). The Robotics and Multibody Mechanics Research Group is a partner of the Agile and Human Centered Production and Robotic Systems Research Priority of Flanders Make. The authors would like to thank DLR for sharing the virtual model of Justin.

Author information

Correspondence to Greet Van de Perre.


About this article


Cite this article

Van de Perre, G., Cao, HL., De Beir, A. et al. Generic method for generating blended gestures and affective functional behaviors for social robots. Auton Robot 42, 569–580 (2018). https://doi.org/10.1007/s10514-017-9650-0

