Abstract
The rapid development of intelligent robotics suggests that humans and robots will live and work together in shared workspaces in the near future, which makes research on effective human-robot interaction essential. The most common interaction scenario is cooperative work, in which a robot should give proper assistance to a human to achieve a shared goal. The workspace contains several objects, including tools, and the robot should identify which object or tool the human intends. Because obstacles in the environment can create situational differences between the robot's viewpoint and the human's, the robot needs to take the human's perspective and simulate the situation from that perspective to identify the intended object. For human perspective taking, the robot first needs to check its own visibility of the environment. To address this challenge, this paper develops a 3D visibility check method that uses a depth image in Webots. With the developed method, a robot can determine whether each point in the environment is visible or invisible from its current posture, and detect objects when they are visible.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Han, JH., Kim, JH. (2015). 3D Visibility Check in Webots for Human Perspective Taking in Human-Robot Interaction. In: Kim, JH., Yang, W., Jo, J., Sincak, P., Myung, H. (eds) Robot Intelligence Technology and Applications 3. Advances in Intelligent Systems and Computing, vol 345. Springer, Cham. https://doi.org/10.1007/978-3-319-16841-8_24
DOI: https://doi.org/10.1007/978-3-319-16841-8_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-16840-1
Online ISBN: 978-3-319-16841-8
eBook Packages: Engineering (R0)