Autonomous Navigation Based on Sequential Images for Planetary Landing

  • Conference paper
  • First Online:
Proceedings of the 2015 Chinese Intelligent Systems Conference

Part of the book series: Lecture Notes in Electrical Engineering ((LNEE))

Abstract

A new autonomous navigation scheme for planetary landing is presented. The navigation system consists of an inertial measurement unit (IMU) and a stereo camera that measures unit direction vectors and ranges from the camera to detected landmarks. The lander’s motion is estimated by an algorithm known as vision-aided inertial navigation (VAIN). The algorithm uses the unit direction vectors and range measurements of features tracked across two sequential images, together with the lander’s corresponding poses derived from the IMU, and it requires no a priori terrain information. An augmented implicit extended Kalman filter (IEKF) tightly integrates the stereo-camera and IMU measurements to produce an accurate estimate of the lander’s pose and velocity and to correct the constant IMU biases. Numerical simulation results show that the proposed VAIN method substantially improves the navigation accuracy of the INS and satisfies the requirements of future planetary exploration missions.
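To make the geometry behind this scheme concrete, the sketch below shows how a unit direction vector and a range measurement locate a landmark in the camera frame, and how two sightings of the same landmark in sequential frames yield an implicit (ideally zero-valued) residual that a filter of the implicit-EKF type can use to correct inertial drift. This is a minimal illustration under assumed frame conventions; the function names, frame definitions, and numbers are hypothetical and are not taken from the paper's actual formulation.

```python
# Illustrative sketch (not the authors' implementation): a unit direction
# vector plus a range measurement defines a landmark position in the camera
# frame; viewing the same landmark from two sequential frames constrains the
# relative pose (R_12, t_12) predicted by the IMU.
import numpy as np

def landmark_position(unit_dir, rng):
    """Landmark position in the camera frame from a unit direction and a range."""
    u = np.asarray(unit_dir, dtype=float)
    return rng * u / np.linalg.norm(u)  # re-normalize defensively

def implicit_residual(u1, r1, u2, r2, R_12, t_12):
    """Implicit measurement for one landmark tracked across frames 1 and 2.

    If (R_12, t_12) is the true frame-1 -> frame-2 transform, the residual is
    zero up to measurement noise; a filter drives it toward zero to correct
    the drifting inertial solution.
    """
    p1 = landmark_position(u1, r1)   # landmark expressed in frame 1
    p2 = landmark_position(u2, r2)   # same landmark expressed in frame 2
    return p2 - (R_12 @ p1 + t_12)   # ~0 for the true relative pose

if __name__ == "__main__":
    # Toy example: small yaw rotation and a short translation between frames.
    theta = np.deg2rad(2.0)
    R_12 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])
    t_12 = np.array([0.5, -0.2, 1.0])             # lander motion between frames (m)

    p_frame1 = np.array([120.0, -40.0, 300.0])    # hypothetical landmark in frame 1 (m)
    p_frame2 = R_12 @ p_frame1 + t_12             # ground-truth view from frame 2

    u1, r1 = p_frame1 / np.linalg.norm(p_frame1), np.linalg.norm(p_frame1)
    u2, r2 = p_frame2 / np.linalg.norm(p_frame2), np.linalg.norm(p_frame2)

    print(implicit_residual(u1, r1, u2, r2, R_12, t_12))  # approximately [0, 0, 0]
```

In an actual filter, the residual's Jacobians with respect to the pose and velocity errors and the IMU biases would feed the update step; the example above only shows the geometric constraint that makes such an update possible.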

Author information

Correspondence to Dayi Wang.

Copyright information

© 2016 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Xu, C., Wang, D., Huang, X. (2016). Autonomous Navigation Based on Sequential Images for Planetary Landing. In: Jia, Y., Du, J., Li, H., Zhang, W. (eds) Proceedings of the 2015 Chinese Intelligent Systems Conference. Lecture Notes in Electrical Engineering. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-48365-7_41

  • DOI: https://doi.org/10.1007/978-3-662-48365-7_41

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-48363-3

  • Online ISBN: 978-3-662-48365-7

  • eBook Packages: Engineering, Engineering (R0)
