Robot Localization Using Omnidirectional Color Images
We describe a vision-based indoor mobile robot localization algorithm that does not require historical position estimates. The method assumes the presence of an a priori map and a reference omnidirectional view of the workspace. The current omnidirectional image of the environment is captured whenever the robot needs to relocalize. A modified hue profile is generated for each incoming image and compared with that of the reference image to find the correspondence. The current position of the robot can then be determined by triangulation, as both the reference position and the map of the workspace are available. The method was tested by mounting the camera system at a number of random positions in an 11.0 m × 8.5 m room. The average localization error was 0.45 m. No mismatch of features between the reference and incoming images was found among the test cases.
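The pipeline the abstract describes — reduce each omnidirectional image to an angular hue profile, align the incoming profile against the reference one, then triangulate from known map positions — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the column-averaged hue profile, the brute-force circular correlation, and the two-landmark ray intersection (`hue_profile`, `match_bearing`, `triangulate`) are all assumed simplifications of the paper's modified hue profile and triangulation steps.

```python
import numpy as np

def hue_profile(hue_panorama):
    """Collapse an unwrapped omnidirectional hue image (rows x columns,
    one column per viewing direction) into a 1D angular hue profile.
    Simplified stand-in for the paper's modified hue profile."""
    return hue_panorama.mean(axis=0)

def match_bearing(profile, ref_profile):
    """Find the angular offset (degrees) that best aligns the incoming
    profile with the reference profile via circular cross-correlation."""
    n = len(profile)
    best_shift, best_score = 0, -np.inf
    for s in range(n):
        score = np.dot(np.roll(ref_profile, s), profile)
        if score > best_score:
            best_shift, best_score = s, score
    return 360.0 * best_shift / n

def triangulate(l1, l2, bearing1, bearing2):
    """Locate the robot from bearings (radians, world frame) to two
    landmarks at known map positions l1 and l2.
    The robot position R satisfies R + t1*d1 = l1 and R + t2*d2 = l2."""
    d1 = np.array([np.cos(bearing1), np.sin(bearing1)])
    d2 = np.array([np.cos(bearing2), np.sin(bearing2)])
    # Solve t1*d1 - t2*d2 = l1 - l2 for the two range parameters.
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(l1, float) - np.asarray(l2, float))
    return np.asarray(l1, float) - t[0] * d1
```

For example, a robot at (2, 2) sees landmarks at (0, 0) and (4, 0) under bearings of 225° and -45°; intersecting the two rays recovers the position. In practice, the correlation peak gives the bearing difference to each matched feature, and the map supplies the landmark coordinates.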