Visual Positioning of Distant Wall-Climbing Robots Using Convolutional Neural Networks

Abstract

Detection of visual markers, such as circular markers or quick-response (QR) codes, is a common approach to positioning wall-climbing robots. However, when the camera is far from the robot (e.g., 20 m), these markers become extremely blurred and difficult to detect. In this paper, a convolutional neural network-based positioning scheme comprising a global bounding-box detector and a local wheel detector is proposed. The lightweight local wheel detector quickly and accurately detects the four wheel points of a distant wall-climbing robot, and the detected wheel points are used to calculate its position and direction angle. The wheel detector has a single-frame processing time of 72.2 ms on a CPU and 7.1 ms on a GPU, where the latter meets the real-time positioning requirements of the wall-climbing robot. We also developed an efficient cost function for matching wheels between video frames. Simulation results and multiple test videos confirmed that the proposed cost function matches wheels between video frames perfectly. The high performance of this positioning system indicates that it may be used in a variety of industrial applications.
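The abstract describes two computations: recovering a position and direction angle from the four detected wheel points, and matching wheels between consecutive frames with a cost function. The sketch below illustrates both in a minimal form; the wheel ordering (front/rear axle pairs) and the squared-displacement matching cost are assumptions for illustration, not the paper's actual formulation.

```python
import itertools
import math

def pose_from_wheels(wheels):
    """Estimate the robot's center and direction angle from four wheel
    points, assumed ordered [front-left, front-right, rear-left, rear-right]
    (hypothetical ordering; the paper's convention is not given here)."""
    (flx, fly), (frx, fry), (rlx, rly), (rrx, rry) = wheels
    # Position: centroid of the four wheel points.
    cx = (flx + frx + rlx + rrx) / 4.0
    cy = (fly + fry + rly + rry) / 4.0
    # Direction angle: vector from the rear-axle midpoint to the
    # front-axle midpoint.
    fx, fy = (flx + frx) / 2.0, (fly + fry) / 2.0
    rx, ry = (rlx + rrx) / 2.0, (rly + rry) / 2.0
    angle = math.atan2(fy - ry, fx - rx)
    return (cx, cy), angle

def match_wheels(prev, curr):
    """Match wheels between consecutive frames by minimizing total
    squared displacement over all 4! permutations -- a simple stand-in
    for the paper's cost function, which the abstract does not specify."""
    best, best_cost = None, float("inf")
    for perm in itertools.permutations(range(len(curr))):
        cost = sum((prev[i][0] - curr[j][0]) ** 2 +
                   (prev[i][1] - curr[j][1]) ** 2
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best
```

Exhaustive permutation search is tractable here because only four wheels are matched per frame; a real-time implementation could equally use a Hungarian-algorithm solver.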



Acknowledgments

This work was supported by the National Natural Science Foundation of China (Nos. U1613203 and 51975514), Shenzhen Science and Technology Plan (No. JCYJ20170816172938761), and the Fundamental Research Funds for the Central Universities (No. 51221004).

Author information


Corresponding author

Correspondence to Xin Li.




About this article


Cite this article

Zhou, Q., Li, X. Visual Positioning of Distant Wall-Climbing Robots Using Convolutional Neural Networks. J Intell Robot Syst 98, 603–613 (2020). https://doi.org/10.1007/s10846-019-01096-w


Keywords

  • Wall-climbing robot
  • Localization
  • CNN