Abstract
Autonomous operation of small UAVs in cluttered environments rests on three foundations: fast, accurate pose estimation for control; obstacle detection and avoidance for safe flight; and real-time execution of both onboard the vehicle. This is challenging for micro air vehicles, whose limited payload demands small, lightweight, low-power sensors and processing units, favoring vision-based solutions that run on small embedded computers with smartphone-class processors. In this chapter, we present the JPL autonomous navigation framework for micro air vehicles that addresses these challenges. Our approach enables power-up-and-go deployment in highly cluttered, GPS-denied environments, using an IMU and a single downward-looking camera for pose estimation, and a forward-looking stereo camera pair for disparity-based obstacle detection and avoidance. As an example of a high-level navigation task built on these capabilities, we introduce our approach to autonomous landing on elevated flat surfaces, such as rooftops, using only monocular vision from the downward-looking camera.
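The chapter's stereo pipeline is not reproduced here, but the core of disparity-based obstacle detection can be illustrated with a minimal sketch: convert a disparity map to metric depth via Z = fB/d, then flag pixels closer than a safety distance. All function names, parameters, and the toy values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m, min_disp=0.5):
    """Convert a stereo disparity map (pixels) to metric depth (meters).

    For each pixel with a valid disparity d, depth is Z = f * B / d,
    where f is the focal length in pixels and B the stereo baseline in
    meters. Pixels below min_disp are treated as 'no measurement' (inf).
    """
    depth = np.full(disparity.shape, np.inf)
    valid = disparity >= min_disp
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

def obstacle_mask(depth, safety_dist_m=2.0):
    """Flag every pixel whose depth is inside the safety distance."""
    return depth < safety_dist_m

# Toy 2x3 disparity map for an assumed camera with f = 400 px, B = 0.1 m
disp = np.array([[40.0, 10.0, 0.0],
                 [80.0, 20.0, 5.0]])
depth = disparity_to_depth(disp, focal_px=400.0, baseline_m=0.1)
mask = obstacle_mask(depth, safety_dist_m=2.0)
# depth[0, 0] is 1.0 m (obstacle); the zero-disparity pixel stays inf.
```

In a real system the disparity map would come from an onboard stereo matcher and the mask would feed a local planner; the sketch only shows the geometric step in between.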
Acknowledgments
This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this chapter
Brockers, R., Humenberger, M., Kuwata, Y., Matthies, L., Weiss, S. (2014). Computer Vision for Micro Air Vehicles. In: Kisačanin, B., Gelautz, M. (eds) Advances in Embedded Computer Vision. Advances in Computer Vision and Pattern Recognition. Springer, Cham. https://doi.org/10.1007/978-3-319-09387-1_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-09386-4
Online ISBN: 978-3-319-09387-1
eBook Packages: Computer Science, Computer Science (R0)