Abstract
Head nods have been shown to play an important role in communication management in human interaction, e.g. as a non-verbal feedback signal from the listener. Building on a study with virtual agents, which showed that head nods help elicit more verbal input from the user, we investigate the use of head nods in communication between a user and a humanoid robot (Nao) that they meet for the first time. Contrary to the virtual agent case, the robot elicited less talking from the user when it used head nods as a feedback signal. A follow-up experiment revealed that the physical embodiment of the robot strongly influenced the users' behavior in these first encounters.
© 2014 Springer International Publishing Switzerland
Krogsager, A., Segato, N., Rehm, M. (2014). Backchannel Head Nods in Danish First Meeting Encounters with a Humanoid Robot: The Role of Physical Embodiment. In: Kurosu, M. (eds) Human-Computer Interaction. Advanced Interaction Modalities and Techniques. HCI 2014. Lecture Notes in Computer Science, vol 8511. Springer, Cham. https://doi.org/10.1007/978-3-319-07230-2_62
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-07229-6
Online ISBN: 978-3-319-07230-2