
Experimental evaluation of a real-time GPU-based pose estimation system for autonomous landing of rotary wings UAVs

Published in: Control Theory and Technology

Abstract

This paper proposes a real-time system for pose estimation of an unmanned aerial vehicle (UAV) using parallel image processing and a fiducial marker. The system exploits the capabilities of a high-performance CPU/GPU embedded platform to provide on-board, high-frequency pose estimation that enables autonomous takeoff and landing. The system is evaluated extensively in lab and field tests using a custom quadrotor, and autonomous landing with the proposed algorithm is successfully demonstrated in experimental tests. The results show that the system provides precise pose estimation at a frame rate of at least 30 fps with an image resolution of 640×480 pixels. The main advantage of the proposed approach lies in the use of the GPU for image filtering and marker detection: the GPU-based pipeline places an upper bound on the required computation time regardless of image complexity, thereby allowing robust marker detection even in cluttered environments.
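
The article page does not reproduce the authors' source code. As a rough illustration of the kind of bounded-time GPU stage described above, the sketch below implements a per-pixel binary threshold filter in CUDA, a common pre-processing step ahead of fiducial-marker detection. The kernel name, block size, and threshold value are illustrative assumptions and do not come from the paper.

// Minimal CUDA sketch (not the authors' implementation): per-pixel binary
// threshold on a grayscale frame. One thread processes one pixel, so the
// amount of work is fixed by the image resolution rather than by scene
// content.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void thresholdKernel(const unsigned char* in, unsigned char* out,
                                int width, int height, unsigned char thresh)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;
    int idx = y * width + x;
    out[idx] = (in[idx] > thresh) ? 255 : 0;   // binarize for marker detection
}

int main()
{
    const int width = 640, height = 480;       // resolution used in the paper
    const size_t bytes = static_cast<size_t>(width) * height;

    std::vector<unsigned char> hostIn(bytes, 128);   // placeholder frame
    std::vector<unsigned char> hostOut(bytes, 0);

    unsigned char *devIn = nullptr, *devOut = nullptr;
    cudaMalloc((void**)&devIn, bytes);
    cudaMalloc((void**)&devOut, bytes);
    cudaMemcpy(devIn, hostIn.data(), bytes, cudaMemcpyHostToDevice);

    dim3 block(16, 16);                              // illustrative block size
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);
    thresholdKernel<<<grid, block>>>(devIn, devOut, width, height, 127);
    cudaDeviceSynchronize();

    cudaMemcpy(hostOut.data(), devOut, bytes, cudaMemcpyDeviceToHost);
    printf("first output pixel: %d\n", hostOut[0]);

    cudaFree(devIn);
    cudaFree(devOut);
    return 0;
}

Because each pixel maps to a fixed amount of work, the kernel's execution time depends only on the image resolution and the GPU's throughput, not on scene content, which is the bounded computation-time property the abstract attributes to the GPU pipeline.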

Author information


Corresponding author

Correspondence to Alessandro Benini.

Additional information

This work was partially supported by the National Science Foundation (NSF) under Computer and Network Systems (CNS) awards (Nos. 1229236 and 1446285).

Alessandro BENINI is a Research Scientist at the University of Denver (DU) Unmanned Systems Research Institute (DU2SRI). He received his Ph.D. degree in Automation Engineering from the Università Politecnica delle Marche, Ancona, Italy. His research interests include the development of cooperating autonomous mobile systems (UAVs and UGVs) with high safety performance and reliability, simulation and control of dynamic systems, computer vision, and sensor fusion. Dr. Benini is an IEEE Member.

Matthew J. RUTHERFORD is an Associate Professor in the Department of Computer Science with a joint appointment in the Department of Electrical and Computer Engineering, and Deputy Director of the DU Unmanned Systems Research Institute (DU2SRI). His research portfolio includes: the development of advanced control and communication mechanisms for autonomous aerial and ground robots; applications of real-time computer vision to robotics problems using GPU-based parallel processing; testing and dynamic evaluation of embedded, real-time systems; the development of complex mechatronic systems spanning mechanical, electrical, and software components; software techniques to reduce the energy consumed by hardware; and the development of a high-precision propulsion system for underwater robots.

Kimon P. VALAVANIS is John Evans Professor and Chair of the Department of Electrical and Computer Engineering, with a joint appointment in the Department of Computer Science. He is also Director of the DU Unmanned Systems Research Institute (DU2SRI). His research interests are in the areas of intelligent control, robotics and automation, and distributed intelligent systems. He has more than 400 refereed publications, including 16 books and edited books. He is a Fellow of the American Association for the Advancement of Science (AAAS), a Fellow of the U.K. Institute of Measurement and Control, a Senior Member of the IEEE, Editor-in-Chief of the Journal of Intelligent and Robotic Systems (Springer), and a Fulbright Scholar.


About this article

Cite this article

Benini, A., Rutherford, M.J. & Valavanis, K.P. Experimental evaluation of a real-time GPU-based pose estimation system for autonomous landing of rotary wings UAVs. Control Theory Technol. 16, 145–159 (2018). https://doi.org/10.1007/s11768-018-7297-9

Keywords

Navigation