Corporate Robot Motion Identity

  • Jakob Reinhardt
  • Jonas Schmidtler
  • Klaus Bengler
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 823)


Abstract

Mobile robotic systems are increasingly merging into human-dominated areas and will therefore interact and coordinate with pedestrians in private and public spaces. To ease intuitive coordination in human-robot interaction, robots should be able to express intent via motion. This enables an observer to quickly and confidently infer the robot's goal and so establish productive encounters. For long-term interaction, trajectories in straight drive or curvature have been optimized for this purpose. In addition, short-term movement cues, i.e. perceivable changes in motion parameters and direction of movement, can be utilized to express intent in a non-verbal manner. For example, yielding priority to a person via a short back-off movement cue, as opposed to merely stopping, offers the possibility of legible and agreeable robot navigation. In the service design domain, the behavior of front-line personnel is a crucial factor in how an organization is perceived by customers and society. Recent developments show that mobile robotic systems are increasingly supplementing a service company's front-line personnel; companies such as Starship Technologies and Deutsche Post apply service robots for transportation purposes. Integrating robot motion into an organization's visual identity, so that motion communicates what the organization wants to express, could contribute to the customer experience. In order to provide movement cues that are not only legible but also convey an inherent personality of the robot carrying out the task, and thereby reflect on the organization's public image, we discuss the aforementioned factors for consideration when developing a corporate robot motion identity. We integrate service quality domains and affected human roles for application in the creative practice of designing motion. Thus, recognizable movement cues designed to express intent to coexisting and cooperating pedestrians in an everyday context can be tailored to what an organization wants to express to its environment, customers, and other stakeholders.


Keywords: Service robots · Corporate identity · Service design · Human-robot interaction · Motion planning · Legibility · Movement cues



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Jakob Reinhardt (1)
  • Jonas Schmidtler (1)
  • Klaus Bengler (1)

  1. Chair of Ergonomics, Technical University of Munich, Garching, Germany