Abstract
This paper presents work on controlling robots of differing geometries and capabilities so that they complete a task neither could complete independently. The algorithms were tested in a 'cherry picker' experiment in which one large, inaccurate robot lifts and carries a smaller, more accurate mobile robot in order to complete an inspection task: placing the smaller robot's end-effector at a specified distance from a visually-specified target. Both mobile robots are controlled using an uncalibrated visual guidance method. An overview of the algorithm for coordinating the two mobile robots is presented, along with details and results of experiments conducted to measure the accuracy with which the shared task is completed. Results show the system to be accurate and robust while requiring very little communication between the two robots.
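The abstract mentions that both robots are guided by an uncalibrated visual method. The paper itself is behind a subscription, so as a purely illustrative sketch of the general idea, the toy example below shows one common uncalibrated visual-servoing scheme: the controller never uses camera calibration parameters, but instead estimates the image Jacobian online with a Broyden rank-1 update while driving the image-feature error to zero. All names, the 2-D linear "camera" model, and the gains are assumptions for illustration, not the authors' algorithm.

```python
def camera(x):
    """Unknown plant: maps robot position to image features.
    A toy linear model standing in for a real, uncalibrated camera."""
    return [1.8 * x[0] + 0.3 * x[1], -0.2 * x[0] + 1.5 * x[1]]

def solve2(J, b):
    """Solve the 2x2 linear system J d = b by Cramer's rule."""
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    return [(J[1][1] * b[0] - J[0][1] * b[1]) / det,
            (-J[1][0] * b[0] + J[0][0] * b[1]) / det]

def servo(target, x, steps=50, gain=0.5):
    """Drive the camera features toward `target` without ever knowing
    the camera model: estimate the Jacobian J online via Broyden updates."""
    J = [[1.0, 0.0], [0.0, 1.0]]  # rough initial Jacobian guess (identity)
    f = camera(x)
    for _ in range(steps):
        err = [f[0] - target[0], f[1] - target[1]]
        # Newton-like step using the current Jacobian estimate
        dx = solve2(J, [-gain * err[0], -gain * err[1]])
        x = [x[0] + dx[0], x[1] + dx[1]]
        f_new = camera(x)
        df = [f_new[0] - f[0], f_new[1] - f[1]]
        # Broyden rank-1 update: correct J along the direction just moved
        denom = dx[0] ** 2 + dx[1] ** 2
        if denom > 1e-12:
            r = [df[0] - (J[0][0] * dx[0] + J[0][1] * dx[1]),
                 df[1] - (J[1][0] * dx[0] + J[1][1] * dx[1])]
            for i in range(2):
                for j in range(2):
                    J[i][j] += r[i] * dx[j] / denom
        f = f_new
    return x, f
```

Because the update only needs observed feature displacements, a scheme like this needs no calibrated camera-to-robot transform, which is one reason uncalibrated approaches suit heterogeneous robot pairs whose geometry differs.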
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Yoder, JD., Seelinger, M. (2006). Visual Coordination of Heterogeneous Mobile Manipulators. In: Ang, M.H., Khatib, O. (eds) Experimental Robotics IX. Springer Tracts in Advanced Robotics, vol 21. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11552246_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28816-9
Online ISBN: 978-3-540-33014-1