Pointing Estimation for Human-Robot Interaction Using Hand Pose, Verbal Cues, and Confidence Heuristics

  • Andrew Showers
  • Mei Si
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10914)

Abstract

People utilize pointing directives frequently and effortlessly. Robots, therefore, will need to interpret these directives in order to understand the intention of the user. This is not a trivial task, as the intended pointing direction rarely aligns with the ground-truth pointing vector. Standard methods require head, arm, and hand pose estimation, which inhibits handling of the more complex pointing gestures found in human-human interactions. In this work, we aim to interpret these pointing directives by using the pose of the index finger, capturing both simple and complex gestures. Furthermore, this method can act as a fallback when full-body pose information is not available. This paper demonstrates the ability of a robot to determine pointing direction using data collected from a Microsoft Kinect camera. The finger joints are detected in 3D space and used in conjunction with verbal cues from the user to determine the point of interest (POI). In addition, confidence heuristics are provided to assess the quality of the source information, whether verbal or physical. We evaluated the performance of these features with a support vector machine, a decision tree, and a generalized model that does not rely on a learning algorithm.
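
The pipeline described above (3D index-finger joints combined with verbal cues to select a POI) can be pictured as a ray-scoring procedure. The sketch below is illustrative only and is not the paper's model; it assumes numpy, hypothetical names (finger_base, finger_tip, verbal_label), and a simple soft prior for the verbal cue, whereas the paper evaluates these features with an SVM, a decision tree, and a generalized model.

```python
# Minimal sketch (not the authors' implementation): estimate a pointing ray from
# two index-finger joints and rank candidate objects by angular distance to the
# ray, down-weighting the error of candidates that match a verbal cue.
# All names (finger_base, finger_tip, verbal_label) are illustrative.
import numpy as np


def pointing_ray(finger_base, finger_tip):
    """Return (origin, unit direction) of the ray through two finger joints."""
    origin = np.asarray(finger_base, dtype=float)
    direction = np.asarray(finger_tip, dtype=float) - origin
    return origin, direction / np.linalg.norm(direction)


def angular_error(origin, direction, target):
    """Angle (radians) between the pointing ray and the vector to a target."""
    to_target = np.asarray(target, dtype=float) - origin
    cos_theta = np.dot(direction, to_target) / np.linalg.norm(to_target)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))


def rank_candidates(finger_base, finger_tip, candidates,
                    verbal_label=None, verbal_weight=0.5):
    """Score each (label, position) candidate; lower scores are better.

    The verbal cue acts as a soft prior: a candidate whose label matches the
    spoken word has its angular error scaled down by `verbal_weight`.
    """
    origin, direction = pointing_ray(finger_base, finger_tip)
    scored = []
    for label, position in candidates:
        score = angular_error(origin, direction, position)
        if verbal_label is not None and label == verbal_label:
            score *= verbal_weight
        scored.append((score, label))
    return sorted(scored)


if __name__ == "__main__":
    # Toy example: finger joints and two objects in Kinect-style 3D coordinates.
    objects = [("cup", (0.6, 0.1, 1.8)), ("book", (-0.4, 0.0, 1.5))]
    ranking = rank_candidates((0.0, 0.2, 0.9), (0.05, 0.21, 1.0),
                              objects, verbal_label="cup")
    print(ranking)  # best-matching point of interest first
```

In this toy setup the verbal cue simply rescales the geometric error; the paper instead feeds the physical and verbal features, together with their confidence heuristics, into learned classifiers.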

Keywords

Pointing · Object detection · Object localization · Social interaction

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Rensselaer Polytechnic Institute, Troy, USA