
Journal of Intelligent & Robotic Systems, Volume 69, Issue 1–4, pp 459–473

Omnidirectional Vision for UAV: Applications to Attitude, Motion and Altitude Estimation for Day and Night Conditions

  • Ashutosh Natraj
  • Dieu Sang Ly
  • Damien Eynard
  • Cédric Demonceaux
  • Pascal Vasseur

Abstract

This paper presents several applications of omnidirectional vision to aerial robotics. Omnidirectional vision is first used to estimate the attitude, altitude and motion of a UAV, in both rural and urban environments. Secondly, a combination of omnidirectional and perspective cameras is used to estimate the altitude. Finally, we present a stereo system consisting of an omnidirectional camera and a laser pattern projector that computes altitude and attitude in poorly illuminated and dark environments. We demonstrate that an omnidirectional camera, in conjunction with other sensors, is a suitable choice for UAV applications across different operating environments and illumination conditions.
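To give a concrete feel for the last configuration, the following is a minimal Python sketch, not the authors' implementation: assuming the projected laser pattern has already been triangulated into 3D points expressed in the camera frame, a ground plane is fitted by SVD and the camera roll, pitch and altitude are read off the plane parameters. The function names, angle conventions and synthetic data are illustrative assumptions only.

```python
import numpy as np

def plane_fit(points):
    """Fit a plane to Nx3 points via SVD; return (unit normal, centroid)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                      # direction of least variance = plane normal
    if normal[2] < 0:                    # pick a consistent orientation (+z component)
        normal = -normal
    return normal, centroid

def attitude_altitude(points):
    """Roll, pitch (rad) and altitude (m) of the camera w.r.t. the fitted plane.
    Angle conventions are an arbitrary illustrative choice, not the paper's."""
    n, c = plane_fit(points)
    roll = np.arctan2(n[1], n[2])                     # tilt about the camera x-axis
    pitch = np.arctan2(-n[0], np.hypot(n[1], n[2]))   # tilt about the camera y-axis
    altitude = abs(np.dot(n, c))                      # distance from camera origin to plane
    return roll, pitch, altitude

if __name__ == "__main__":
    # Synthetic laser points on a near-planar ground 1.5 m from a slightly tilted camera
    rng = np.random.default_rng(0)
    xy = rng.uniform(-1, 1, size=(200, 2))
    z = 1.5 + 0.05 * xy[:, 0] + 0.01 * rng.standard_normal(200)
    pts = np.column_stack([xy, z])
    print(attitude_altitude(pts))
```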

Keywords

UAV · Omnivision · Computer vision · Altitude, attitude and motion estimation



Copyright information

© Springer Science+Business Media B.V. 2012

Authors and Affiliations

  • Ashutosh Natraj 1, 4
  • Dieu Sang Ly 2
  • Damien Eynard 3
  • Cédric Demonceaux 2
  • Pascal Vasseur 3

  1. Université de Picardie Jules Verne, Amiens, France
  2. Laboratory LE2I, Université de Bourgogne, Bourgogne, France
  3. Laboratory LITIS, Université de Rouen, Rouen, France
  4. Laboratory LE2I, Université de Bourgogne, Bourgogne, France
