b-it-bots: Our Approach for Autonomous Robotics in Industrial Environments

  • Abhishek Padalkar
  • Mohammad Wasil
  • Shweta Mahajan
  • Ramesh Kumar
  • Dharmin Bakaraniya
  • Raghuvir Shirodkar
  • Heruka Andradi
  • Deepan Padmanabhan
  • Carlo Wiesse
  • Ahmed Abdelrahman
  • Sushant Chavan
  • Naresh Gurulingan
  • Deebul Nair
  • Santosh Thoduka
  • Iman Awaad
  • Sven Schneider
  • Paul G. Plöger
  • Gerhard K. Kraetzschmar
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11531)

Abstract

This paper presents the approach of our team, b-it-bots, in the RoboCup@Work competition, with which we won the 2019 World Championship in Sydney. We describe our current hardware, including modifications made to the KUKA youBot, the underlying software framework, and components developed for navigation, manipulation, perception and task planning in industrial environments. Our combined 2D and 3D approach to object recognition has improved robustness and performance compared to previous years, and our task planning framework has moved us away from large state machines for high-level control. Future work includes closing the perception-manipulation loop for more robust grasping. Our open-source repository is available at https://github.com/b-it-bots/mas_industrial_robotics.

Acknowledgement

We are grateful for the advice and guidance given over the years by Professor Gerhard Kraetzschmar, who sadly passed away this year. He was one of the founders of RoboCup@Work and was actively involved in the activities of team b-it-bots. We gratefully acknowledge the continued support of the team by the b-it Bonn-Aachen International Center for Information Technology, Bonn-Rhein-Sieg University of Applied Sciences and AStA H-BRS.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

Department of Computer Science, Hochschule Bonn-Rhein-Sieg, Sankt Augustin, Germany