Probabilistic Hough Voting for Attitude Estimation from Aerial Fisheye Images

  • Bertil Grelsson
  • Michael Felsberg
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7944)


Abstract

For navigation of unmanned aerial vehicles (UAVs), attitude estimation is essential. We present a method for attitude estimation (pitch and roll angles) from aerial fisheye images through horizon detection. The method is based on edge detection and a probabilistic Hough voting scheme. In a flight scenario, there is often some prior knowledge of the vehicle altitude and attitude. We exploit these priors to make the attitude estimation more robust: the edge-pixel votes are weighted according to the probability distributions of the altitude and of the pitch and roll angles. Unlike most horizon detection methods, ours does not require any sky/ground segmentation. The method has been evaluated on aerial fisheye images from the internet. The horizon is robustly detected in all tested images, and the deviation between our automated attitude estimate and one obtained from manual horizon detection is less than 1°.
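To illustrate the core idea of prior-weighted Hough voting, the sketch below implements a toy version in Python. The paper detects the horizon as a great circle projected into a calibrated fisheye image; this sketch instead uses a simple straight-line (rho, theta) parameterization on a perspective image, and the mappings from line parameters to pitch and roll are crude assumptions for illustration, not the paper's geometry. The priors on pitch and roll enter as Gaussian weights on each vote, as in the abstract.

```python
import numpy as np

def weighted_hough_horizon(edge_pixels, img_shape,
                           pitch_prior=(0.0, 10.0), roll_prior=(0.0, 10.0),
                           n_rho=181, n_theta=181):
    """Prior-weighted Hough vote for a horizon line (illustrative sketch).

    Priors are (mean, sigma) in degrees; they down-weight votes for
    implausible attitudes. The attitude mappings below are assumptions
    made for this toy example, not the paper's fisheye model.
    """
    h, w = img_shape
    diag = np.hypot(h, w)
    thetas = np.linspace(0.0, np.pi, n_theta)        # line-normal angle
    rhos = np.linspace(-diag / 2, diag / 2, n_rho)   # signed distance to center
    drho = rhos[1] - rhos[0]

    # Crude attitude mapping (assumed): roll ~ tilt of the line,
    # pitch ~ offset of the line from the image center.
    roll_deg = np.degrees(thetas) - 90.0
    pitch_deg = 90.0 * rhos / (diag / 2)

    def gauss(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    w_theta = gauss(roll_deg, *roll_prior)   # prior weight per orientation
    w_rho = gauss(pitch_deg, *pitch_prior)   # prior weight per offset

    acc = np.zeros((n_rho, n_theta))
    cx, cy = w / 2.0, h / 2.0
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in edge_pixels:
        # Each edge pixel votes for every line through it; each vote is
        # weighted by the pitch and roll priors.
        r = (x - cx) * cos_t + (y - cy) * sin_t
        ri = np.clip(np.round((r - rhos[0]) / drho).astype(int), 0, n_rho - 1)
        acc[ri, np.arange(n_theta)] += w_rho[ri] * w_theta

    ri, ti = np.unravel_index(np.argmax(acc), acc.shape)
    return pitch_deg[ri], roll_deg[ti]
```

For example, edge pixels lying on a horizontal line through the image center (a level horizon) yield a vote peak near zero pitch and zero roll. In the paper, the same weighting principle is applied to the great-circle parameters of the horizon in the fisheye image rather than to straight-line parameters.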


Keywords

Fisheye images · Attitude estimation · Horizon detection · Hough voting



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Bertil Grelsson 1, 2
  • Michael Felsberg 1
  1. Computer Vision Laboratory, Linköping University, Sweden
  2. Saab Dynamics, Linköping, Sweden
