Abstract
We implemented a gaze algorithm for interacting with multiple observers as a precursor to a multi-party conversation system. By acknowledging multiple participants in a natural manner, we seek to set the stage for smoother and more effective human-robot conversations featuring proper turn-taking through attention shifts. We used the android robot EveR-4, developed at the Korea Institute of Industrial Technology for human-robot interaction (HRI) applications. The robot wore a dress and makeup so that interacting with it would resemble interacting with a real woman as closely as possible. Using an RGB-D camera, people's faces and positions were tracked so that the robot could distribute its attention appropriately among everyone present. An importance value was assigned to each detected face based on how long it had been detected and its distance from the robot. The robot made facial expressions when it saw people, to increase observers' sense of interaction. We observed people's reactions to our implementation at an exhibition and noted how the overall system could be made more life-like and realistic.
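The abstract describes assigning each detected face an importance value based on detection duration and distance to the robot, then directing the robot's gaze accordingly. The paper's exact formula and weights are not given here, so the sketch below is a hypothetical illustration of that idea: all names, the linear weighting, and the weight values are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field
import time


@dataclass
class TrackedFace:
    """A face tracked via the RGB-D camera (fields are illustrative)."""
    face_id: int
    distance_m: float  # distance from the robot, from the depth channel
    first_seen: float = field(default_factory=time.monotonic)

    def importance(self, now=None, w_time=1.0, w_dist=2.0):
        """Hypothetical score: faces tracked longer and standing closer
        to the robot rank higher. w_time and w_dist are assumed weights."""
        now = time.monotonic() if now is None else now
        duration = now - self.first_seen
        # Clamp distance to avoid division by zero for very close faces.
        return w_time * duration + w_dist / max(self.distance_m, 0.1)


def select_gaze_target(faces):
    """Pick the face the robot should attend to: the most important one,
    or None when nobody is in view."""
    if not faces:
        return None
    return max(faces, key=lambda f: f.importance())
```

In a real system the importance update would run once per camera frame, and the selected target's image coordinates would drive the robot's neck and eye actuators; a hysteresis margin on the score would likely be needed to keep the gaze from flickering between two similarly ranked faces.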
Acknowledgement
This work was supported by the Robot R&D Program (10041659) funded by the Ministry of Trade, Industry and Energy (MOTIE, South Korea).
© 2016 Springer International Publishing AG
Cite this paper
Cruz, B.D., Ahn, B.K., Hyung, H.J., Lee, D.W. (2016). Developing an Interactive Gaze Algorithm for Android Robots. In: Agah, A., Cabibihan, J.J., Howard, A., Salichs, M., He, H. (eds) Social Robotics. ICSR 2016. Lecture Notes in Computer Science, vol 9979. Springer, Cham. https://doi.org/10.1007/978-3-319-47437-3_43
DOI: https://doi.org/10.1007/978-3-319-47437-3_43
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-47436-6
Online ISBN: 978-3-319-47437-3