The Integration of a Multimodal MAV and Biomimetic Sensing for Autonomous Flights in Near-Earth Environments
Homeland security and disaster mitigation efforts often take place in unforeseen environments such as caves, tunnels, forests, cities, and even the interiors of urban structures. Performing tasks such as surveillance, reconnaissance, bomb damage assessment, or search and rescue within unfamiliar territory is not only dangerous but also requires a large, diverse task force. Unmanned robotic vehicles could assist in such missions by providing situational awareness without risking the lives of soldiers, first responders, or other personnel. While ground-based robots have had many successes in search and rescue situations, they move slowly, have trouble traversing rugged terrain, and can still put the operator at risk. Alternatively, small unmanned aerial vehicles (UAVs) can provide soldiers and emergency response personnel with an “eye in the sky” perspective. On an even smaller scale, bird-sized aircraft, or micro air vehicles (MAVs), can be designed to fit in a backpack and rapidly deployed to provide surveillance and reconnaissance in and around buildings, caves, tunnels, and other near-Earth environments. Navigating these environments, however, remains a challenging problem for UAVs. In , promising results are shown for a rotorcraft equipped with a SICK laser scanner. However, because lift decreases with platform size, carrying this type of sensor on a MAV is not feasible.
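The payload argument above follows from square-cube scaling: lift is roughly proportional to wing area (length squared), while structural weight grows with volume (length cubed), so the payload margin collapses faster than the airframe shrinks. A minimal sketch of this reasoning, using purely illustrative numbers that are not from the paper:

```python
# Square-cube scaling sketch: why a shrinking airframe quickly loses
# the ability to carry a fixed-mass sensor (e.g. a laser scanner).
# All base values below are illustrative assumptions, not measured data.

def payload_margin(scale, base_lift=10.0, base_weight=6.0):
    """Available payload force (N) for an airframe shrunk by `scale`.

    Lift scales with wing area (~scale**2); structural weight
    scales with volume (~scale**3).
    """
    lift = base_lift * scale ** 2
    weight = base_weight * scale ** 3
    return lift - weight

if __name__ == "__main__":
    for scale in (1.0, 0.5, 0.25):
        print(f"scale {scale:4.2f}: payload margin {payload_margin(scale):5.3f} N")
```

At full scale the hypothetical craft has 4 N to spare; at quarter scale only about 0.53 N remains, so a sensor whose weight is fixed regardless of platform size rapidly exceeds the available margin.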
Keywords: Optic Flow, Inertial Measurement Unit, Body Frame, Flight Mode, Optic Flow Field
- 1. Barrows G., Mixed-Mode VLSI Optic Flow Sensors for Micro Air Vehicles, PhD Thesis, University of Maryland, 1999.
- 2. Gibson J. J., The Ecological Approach to Visual Perception, Houghton Mifflin, 1979.
- 3. Green W. E., Oh P. Y., “A MAV That Flies Like an Airplane and Hovers Like a Helicopter”, Proceedings, IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Monterey, CA, 2005.
- 4. Green W. E., Oh P. Y., Barrows G., “Flying Insect Inspired Vision for Autonomous Aerial Robot Maneuvers in Near-Earth Environments”, Proceedings, IEEE International Conference on Robotics and Automation, New Orleans, LA, 2004.
- 6. Murphy R., Casper J., Hyams J., Micire M., Minten B., “Mobility and Sensing Demands in USAR”, Proceedings, IEEE Industrial Electronics Conference, Vol. 1, 2000.
- 8. Srinivasan M. V., Zhang S. W., Lehrer M., Collett T. S., “Honeybee Navigation En Route to the Goal: Visual Flight Control and Odometry”, Journal of Experimental Biology, Vol. 199, 237–243, 1996.
- 10. Tammero L. F., Dickinson M. H., “The Influence of Visual Landscape on the Free Flight Behavior of the Fruit Fly Drosophila melanogaster”, Journal of Experimental Biology, Vol. 205, 327–343, 2002.
- 11. Tammero L. F., Dickinson M. H., “Collision Avoidance and Landing Responses are Mediated by Separate Pathways in the Fruit Fly, Drosophila melanogaster”, Journal of Experimental Biology, Vol. 205, 2785–2798, 2002.
- 12. Wertz J. R., Spacecraft Attitude Determination and Control, D. Reidel Publishing Co., 1978.