It was a Pleasure Meeting You

Towards a Holistic Model of Human–Robot Encounters


Social signals are commonly used to improve the usability of humanoid robots. While such signals extend a robot's expressiveness, they are often applied only in structured interactions, where parts of the familiarization or farewell process are disregarded in the evaluation. To establish a more comprehensive view, this article presents a holistic model of human encounters with a social robot. For each phase of the model, we propose and discuss robot signals that aim to express the robot's social awareness. We present an interaction study with participants who are inexperienced in interacting with robots to investigate the effects of these signals. The results verify that implementing the proposed signals benefits the participants' user experience. The study further reveals a strong interdependency among a robot's social signals and underlines the importance of addressing entire encounters in human–robot interaction studies.



  1. Please note that participants experienced four different expressions of the third variable as explained in the text.





This work has been funded in part by the German Research Foundation (DFG) within the Collaborative Research Center 673, Alignment in Communication.

Author information



Corresponding author

Correspondence to Patrick Holthaus.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Holthaus, P., Wachsmuth, S. It was a Pleasure Meeting You. International Journal of Social Robotics (2021).



Keywords

  • Nonverbal signals
  • Holistic human–robot encounters
  • User study