
Learning Hand-Eye Coordination for Robotic Grasping with Large-Scale Data Collection

Conference paper

Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 1)

Abstract

We describe a learning-based approach to hand-eye coordination for robotic grasping from monocular images. To learn hand-eye coordination for grasping, we trained a large convolutional neural network to predict the probability that task-space motion of the gripper will result in successful grasps, using only monocular camera images and independently of camera calibration or the current robot pose. This requires the network to observe the spatial relationship between the gripper and objects in the scene, thus learning hand-eye coordination. We then use this network to servo the gripper in real time to achieve successful grasps. To train our network, we collected over 800,000 grasp attempts over the course of two months, using between 6 and 14 robotic manipulators at any given time, with differences in camera placement and hardware. Our experimental evaluation demonstrates that our method achieves effective real-time control, can successfully grasp novel objects, and corrects mistakes by continuous servoing.
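The servoing loop pairs the grasp-success network with a simple sampling-based optimizer: at each step it samples candidate task-space motions, scores them with the CNN, executes the most promising one, and then repeats from a fresh camera image so earlier errors can be corrected. The sketch below illustrates this loop with the cross-entropy method (CEM) described in the extended version of the paper [Levine et al. 2016]; the function names (`grasp_net`, `get_camera_image`, `execute_motion`), the motion parameterization, and the sample counts are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def cem_select_motion(grasp_net, image, n_samples=64, n_elite=6, n_iters=3):
    """Choose a task-space gripper motion with the cross-entropy method (CEM).

    `grasp_net(image, motions)` is assumed to return, for each candidate
    motion, the CNN's predicted probability that executing it yields a
    successful grasp. Motions are 4-D here: a Cartesian displacement
    (dx, dy, dz) plus a wrist rotation, matching vertical pinch grasps.
    """
    mean = np.zeros(4)
    std = np.array([0.05, 0.05, 0.05, 0.5])  # assumed bounds (metres, radians)

    for _ in range(n_iters):
        candidates = np.random.normal(mean, std, size=(n_samples, 4))
        scores = np.asarray(grasp_net(image, candidates))   # success probabilities
        elite = candidates[np.argsort(scores)[-n_elite:]]   # best-scoring motions
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6  # refit Gaussian

    return mean  # motion command to send to the robot


# Closed-loop servoing (robot interface calls are hypothetical placeholders):
# re-planning from a new image at every step is what lets the controller
# correct earlier mistakes instead of committing to a single open-loop grasp.
#
# while not ready_to_close_gripper():
#     image = get_camera_image()
#     motion = cem_select_motion(grasp_net, image)
#     execute_motion(motion)
```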


Notes

  1. An extended version of this paper is available online [Levine et al. 2016].

  2. In this work, we only consider vertical pinch grasps, though extensions to other grasp parameterizations would be straightforward.

References

  • Bohg, J., Morales, A., Asfour, T., Kragic, D.: Data-driven grasp synthesis - a survey. IEEE Trans. Robot. 30(2), 289–309 (2014)

  • Goldfeder, C., Ciocarlie, M., Dang, H., Allen, P.K.: The Columbia grasp database. In: IEEE International Conference on Robotics and Automation (2009)

  • Hebert, P., Hudson, N., Ma, J., Howard, T., Fuchs, T., Bajracharya, M., Burdick, J.: Combined shape, appearance and silhouette for simultaneous manipulator and object tracking. In: IEEE International Conference on Robotics and Automation (2012)

  • Herzog, A., Pastor, P., Kalakrishnan, M., Righetti, L., Bohg, J., Asfour, T., Schaal, S.: Learning of grasp selection based on shape-templates. Auton. Robots 36(1–2), 51–65 (2014)

  • Hudson, N., Howard, T., Ma, J., Jain, A., Bajracharya, M., Myint, S., Kuo, C., Matthies, L., Backes, P., Hebert, P.: End-to-end dexterous manipulation with deliberate interactive estimation. In: IEEE International Conference on Robotics and Automation (2012)

  • Kappler, D., Bohg, J., Schaal, S.: Leveraging big data for grasp planning. In: IEEE International Conference on Robotics and Automation (2015)

  • Kragic, D., Christensen, H.I.: Survey on visual servoing for manipulation. Computational Vision and Active Perception Laboratory 15 (2002)

  • Leeper, A., Hsiao, K., Chu, E., Salisbury, J.K.: Using near-field stereo vision for robotic grasping in cluttered environments. In: Khatib, O., Kumar, V., Sukhatme, G. (eds.) Experimental Robotics. STAR, vol. 79, pp. 253–267. Springer, Heidelberg (2014)

  • Lenz, I., Lee, H., Saxena, A.: Deep learning for detecting robotic grasps. Int. J. Robot. Res. 34(4–5), 705–724 (2015)

  • Levine, S., Pastor, P., Krizhevsky, A., Quillen, D.: Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection. arXiv preprint (2016). arXiv:1603.02199

  • Pinto, L., Gupta, A.: Supersizing self-supervision: learning to grasp from 50K tries and 700 robot hours. In: IEEE International Conference on Robotics and Automation (2016)

  • Redmon, J., Angelova, A.: Real-time grasp detection using convolutional neural networks. In: IEEE International Conference on Robotics and Automation (2015)

  • Rubinstein, R., Kroese, D.: The Cross-Entropy Method. Springer, New York (2004)

  • Siciliano, B., Khatib, O.: Springer Handbook of Robotics. Springer, Secaucus (2007)

  • Vahrenkamp, N., Wieland, S., Azad, P., Gonzalez, D., Asfour, T., Dillmann, R.: Visual servoing for humanoid grasping and manipulation tasks. In: 8th IEEE-RAS International Conference on Humanoid Robots (2008)


Acknowledgements

We thank Kurt Konolige and Mrinal Kalakrishnan for additional engineering and discussions, Jed Hewitt, Don Jordan, and Aaron Weiss for help with hardware, Max Bajracharya and Nicolas Hudson for the baseline perception pipeline, and Vincent Vanhoucke and Jeff Dean for support and organization.

Author information

Correspondence to Sergey Levine.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Levine, S., Pastor, P., Krizhevsky, A., Quillen, D. (2017). Learning Hand-Eye Coordination for Robotic Grasping with Large-Scale Data Collection. In: Kulić, D., Nakamura, Y., Khatib, O., Venture, G. (eds) 2016 International Symposium on Experimental Robotics. ISER 2016. Springer Proceedings in Advanced Robotics, vol 1. Springer, Cham. https://doi.org/10.1007/978-3-319-50115-4_16

  • DOI: https://doi.org/10.1007/978-3-319-50115-4_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-50114-7

  • Online ISBN: 978-3-319-50115-4

