Abstract
This paper proposes and evaluates the implementation of a self-localization system intended for use in Unmanned Aerial Vehicles (UAVs). Accurate localization is necessary for UAVs for efficient stabilization, navigation and collision avoidance. Conventionally, this requirement is fulfilled using external hardware infrastructure, such as a Global Navigation Satellite System (GNSS) or a camera-based motion capture system (VICON-like [37]). These approaches are, however, not applicable in environments where deployment of cumbersome motion capture equipment is not feasible, as well as in GNSS-denied environments. Systems based on Simultaneous Localization and Mapping (SLAM) require heavy and expensive onboard equipment and large amounts of data transmission for sharing maps between UAVs. The availability of a system without these drawbacks is crucial for the deployment of tight formations of multiple fully autonomous micro UAVs in both outdoor and indoor missions. The project was inspired by the widely used PX4FLOW Smart Camera sensor [12]. The aim was to develop a similar sensor, but without the multiple drawbacks observed in its use, as well as to make its operation more transparent and independent of specific hardware. Our proposed solution requires only a lightweight camera and a single-point range sensor. It is based on optical flow estimation from consecutive images obtained from a downward-facing camera, coupled with a specialized RANSAC-inspired post-processing method that takes flight dynamics into account. This filtering makes it more robust against imperfect lighting, homogeneous ground patches, random close objects and spurious errors. These features make the approach suitable even for coordinated flights through demanding forest-like environments. The system is designed mainly for horizontal velocity estimation, but specialized modifications were also made for estimation of vertical speed and yaw rotation rate.
These methods were tested in a simulator and subsequently in real-world conditions. The tests showed that the sensor is sufficiently reliable and accurate to be usable in practice.
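The RANSAC-inspired consensus filtering and the pixel-to-metric conversion described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the tolerance parameter and the averaging of inliers are our own assumptions; per-section flows are assumed to be given in pixels per frame, with height from the single-point range sensor and a pinhole camera model.

```python
def ransac_consensus_flow(flows, tol=1.0):
    """Pick the candidate flow with the most section flows agreeing within
    `tol` px, then average those inliers. Outlier sections (e.g. a random
    close object under one part of the image) are discarded."""
    best_inliers = []
    for cand in flows:
        inliers = [f for f in flows
                   if (f[0] - cand[0]) ** 2 + (f[1] - cand[1]) ** 2 <= tol ** 2]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    n = len(best_inliers)
    return (sum(f[0] for f in best_inliers) / n,
            sum(f[1] for f in best_inliers) / n)

def metric_velocity(flow_px, height_m, focal_px, dt):
    """Pinhole model: pixel flow scaled by the range-sensor height and the
    focal length (in pixels) gives ground-plane velocity in m/s."""
    return tuple(c / focal_px * height_m / dt for c in flow_px)
```

For example, seven sections reporting a flow of (2, 0) px/frame would outvote two spurious sections, and at 1.5 m height with a 300 px focal length at 30 FPS the filtered flow maps to 0.3 m/s.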
Notes
- 1.
We do not consider the central vector \(\varvec{w}_{22}\), since it is unaffected by rotational and vertical movement in our model.
- 8.
See http://gazebosim.org.
- 9.
For further reference, videos were taken of the experiments. They can be viewed on YouTube in the following playlist: https://www.youtube.com/playlist?list=PLSwHw6pigPZqNijnZfIL8_-otOzRgdQwV.
- 10.
This experiment was performed before optimizing the number of sections.
- 11.
To gain a better overview, the processed video with 16 sections (120 px) was recorded and uploaded to http://youtu.be/bFa2c0LzPZ4.
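Note 1 above states that the central section flow vector carries no rotational or vertical signal, which suggests that the outer vectors do. A minimal sketch of how yaw rate and vertical speed could be recovered from the tangential and radial (divergent) components of the section flows is given below; the decomposition, the function name and the sign conventions are our own assumptions, not the authors' formulation.

```python
def yaw_and_vertical_rates(vectors, height_m, dt):
    """Estimate yaw rate (rad/s) and vertical speed (m/s, positive up) from
    section flow vectors. `vectors` maps section-centre positions (px,
    relative to the image centre) to flow vectors (px/frame). Pure yaw gives
    purely tangential flow; climbing/descending gives purely radial flow."""
    yaw_terms, div_terms = [], []
    for (x, y), (u, v) in vectors.items():
        r2 = x * x + y * y
        if r2 == 0:
            continue  # central vector: no rotational or vertical signal
        yaw_terms.append((x * v - y * u) / r2)  # tangential component / radius
        div_terms.append((x * u + y * v) / r2)  # radial component / radius
    yaw_rate = sum(yaw_terms) / len(yaw_terms) / dt
    # An expanding image (positive divergence) means the ground is approaching,
    # i.e. the UAV is descending, hence the minus sign.
    v_z = -height_m * sum(div_terms) / len(div_terms) / dt
    return yaw_rate, v_z
```

For pure rotation by ω rad per frame the flow at (x, y) is ω·(−y, x), so each tangential term recovers ω exactly; for pure vertical motion the flow is k·(x, y) with scale change k, and v_z = −k·h/dt.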
References
Báča, T., Loianno, G., Saska, M.: Embedded model predictive control of unmanned micro aerial vehicles. In: 21st International Conference on Methods and Models in Automation and Robotics (MMAR) (2016)
Briod, A., Zufferey, J.C., Floreano, D.: Optic-flow based control of a 46g quadrotor. In: Workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments, IROS 2013 (2013)
Aasish, C., Ranjitha, E., Ridhwan, R.: Navigation of UAV without GPS. In: 2015 International Conference on Robotics, Automation, Control and Embedded Systems (RACE), pp. 1–3, February 2015
Chudoba, J., Kulich, M., Saska, M., Báča, T., Přeučil, L.: Exploration and mapping technique suited for visual-features based localization of MAVs. J. Intell. Rob. Syst. 84(1), 351–369 (2016)
Faigl, J., Krajník, T., Chudoba, J., Přeučil, L., Saska, M.: Low-cost embedded system for relative localization in robotic swarms. In: International Conference on Robotics and Automation (ICRA), pp. 993–998. IEEE (2013)
Foroosh, H., Zerubia, J., Berthod, M.: Extension of phase correlation to subpixel registration. IEEE Trans. Image Process. 11, 188–200 (2002)
Gageik, N., Strohmeier, M., Montenegro, S.: An autonomous UAV with an optical flow sensor for positioning and navigation. Int. J. Adv. Rob. Syst. 10(10), 341 (2013). http://dx.doi.org/10.5772/56813
Grabe, V., Bülthoff, H.H., Giordano, P.R.: On-board velocity estimation and closed-loop control of a quadrotor UAV based on optical flow. In: 2012 IEEE International Conference on Robotics and Automation, pp. 491–497, May 2012
Heinrich, A.: An Optical Flow Odometry Sensor Based on the Raspberry Pi Computer. Master’s thesis, Czech Technical University in Prague (2017)
Herissé, B., Hamel, T., Mahony, R., Russotto, F.X.: Landing a VTOL unmanned aerial vehicle on a moving platform using optical flow. IEEE Trans. Rob. 28(1), 77–89 (2012)
Hérissé, B., Hamel, T., Mahony, R., Russotto, F.X.: A terrain-following control approach for a VTOL unmanned aerial vehicle using average optical flow. Auton. Robots 29(3), 381–399 (2010)
Honegger, D., Meier, L., Tanskanen, P., Pollefeys, M.: An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. In: 2013 IEEE International Conference on Robotics and Automation (ICRA), pp. 1736–1741, May 2013
Horn, B.K., Schunck, B.G.: Determining optical flow. Artif. Intell. 17(1), 185–203 (1981). http://www.sciencedirect.com/science/article/pii/0004370281900242
Itseez: Motion analysis and object tracking (2015). http://docs.opencv.org/3.1.0/d7/df3/group__imgproc__motion.html#ga552420a2ace9ef3fb053cd630fdb4952
Joos, M., Ziegler, J., Stiller, C.: Low-cost sensors for image based measurement of 2D velocity and yaw rate. In: 2010 IEEE Intelligent Vehicles Symposium, pp. 658–662, June 2010
Kim, J., Brambley, G.: Dual optic-flow integrated navigation for small-scale flying robots. In: Proceedings of Australasian Conference on Robotics and Automation, Brisbane, Australia (2007)
Kohout, P.: Object carrying by a couple of UAVs. https://youtu.be/nVWqOCK6x24
Krajník, T., Nitsche, M., Faigl, J., Vaněk, P., Saska, M., Přeučil, L., Duckett, T., Mejail, M.: A practical multirobot localization system. J. Intell. Rob. Syst. 76(3–4), 539–562 (2014)
Krátký, V.: Filming by a multi-robot formation (three-point lighting method) - real experiment. https://youtu.be/CuXX3hlA7Hk
More, V., Kumar, H., Kaingade, S., Gaidhani, P., Gupta, N.: Visual odometry using optic flow for unmanned aerial vehicles. In: 2015 International Conference on Cognitive Computing and Information Processing (CCIP), pp. 1–6, March 2015
Petráček, P.: Swarm deployment of helicopters in forest-like environment. https://www.youtube.com/watch?v=hqHW6jYTBEY&index=1&list=PLooTKzV6hvpNF3bTfiOuMZbr2n_tGw0td
PX4: PX4FLOW smart camera (2013). http://pixhawk.org/modules/px4flow. Website; version as of 29th April 2017
Romero, H., Salazar, S., Lozano, R.: Real-time stabilization of an eight-rotor UAV using optical flow. IEEE Trans. Rob. 25(4), 809–817 (2009)
Santamaria-Navarro, A., Solà, J., Andrade-Cetto, J.: High-frequency MAV state estimation using low-cost inertial and optical flow measurement units. In: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1864–1871, September 2015
Saska, M.: MAV-swarms: unmanned aerial vehicles stabilized along a given path using onboard relative localization. In: 2015 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 894–903, June 2015
Saska, M., Chudoba, J., Přeučil, L., Thomas, J., Loianno, G., Třešňák, A., Vonásek, V., Kumar, V.: Autonomous deployment of swarms of micro-aerial vehicles in cooperative surveillance. In: 2014 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 584–595, May 2014
Saska, M., Vakula, J., Přeučil, L.: Swarms of micro aerial vehicles stabilized under a visual relative localization. In: 2014 IEEE International Conference on Robotics and Automation (ICRA), pp. 3570–3575, May 2014
Saska, M., Báča, T., Thomas, J., Chudoba, J., Přeučil, L., Krajník, T., Faigl, J., Loianno, G., Kumar, V.: System for deployment of groups of unmanned micro aerial vehicles in GPS-denied environments using onboard visual relative localization. Auton. Robots 41(4), 919–944 (2017)
Saska, M., Kasl, Z., Přeučil, L.: Motion planning and control of formations of micro aerial vehicles. IFAC Proc. Vol. 47(3), 1228–1233 (2014). 19th IFAC World Congress of the International Federation of Automatic Control (IFAC)
Saska, M., Krajník, T., Vonásek, V., Kasl, Z., Spurný, V., Přeučil, L.: Fault-tolerant formation driving mechanism designed for heterogeneous MAVs-UGVs groups. J. Intell. Rob. Syst. 73(1), 603–622 (2014)
Saska, M., Spurný, V., Vonásek, V.: Predictive control and stabilization of nonholonomic formations with integrated spline-path planning. Robot. Auton. Syst. Part B 75, 379–397 (2016)
Saska, M., Vakula, J., Přeučil, L.: Swarms of micro aerial vehicles stabilized under a visual relative localization. In: IEEE International Conference on Robotics and Automation (ICRA). IEEE (2014)
Saska, M., Vonásek, V., Chudoba, J., Thomas, J., Loianno, G., Kumar, V.: Swarm distribution and deployment for cooperative surveillance by micro-aerial vehicles. J. Intell. Rob. Syst. 84(1), 469–492 (2016)
Saska, M., Vonásek, V., Krajník, T., Přeučil, L.: Coordination and navigation of heterogeneous MAV-UGV formations localized by a hawk-eye-like approach under a model predictive control scheme. Int. J. Rob. Res. 33(10), 1393–1412 (2014)
Stowers, J., Bainbridge-Smith, A., Hayes, M., Mills, S.: Optical flow for heading estimation of a quadrotor helicopter. Int. J. Micro Air Veh. 1(4), 229–239 (2009)
Tersus-GNSS: Precis-BX305 GNSS RTK board (2017). https://cdn.shopify.com/s/files/1/0928/6900/files/Datasheet_Precis-BX305_EN.pdf?336381172763480191. Datasheet; version as of 7th May 2017
Vicon Motion Systems Ltd: Vicon object tracking. https://www.vicon.com/motion-capture/engineering
Acknowledgments
The presented work has been supported by the Czech Science Foundation (GACR) under research project No. 16-24206S and by the Grant Agency of the Czech Technical University in Prague under grant No. SGS15/157/13.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Walter, V., Novák, T., Saska, M. (2018). Self-localization of Unmanned Aerial Vehicles Based on Optical Flow in Onboard Camera Images. In: Mazal, J. (ed.) Modelling and Simulation for Autonomous Systems. MESAS 2017. Lecture Notes in Computer Science, vol. 10756. Springer, Cham. https://doi.org/10.1007/978-3-319-76072-8_8
DOI: https://doi.org/10.1007/978-3-319-76072-8_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-76071-1
Online ISBN: 978-3-319-76072-8