Robot Localization Using Omnidirectional Color Images

  • David C. K. Yuen
  • Bruce A. MacDonald
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1998)

Abstract

We describe a vision-based indoor mobile robot localization algorithm that does not require historical position estimates. The method assumes the presence of an a priori map and a reference omnidirectional view of the workspace. The current omnidirectional image of the environment is captured whenever the robot needs to relocalise. A modified hue profile is generated for each incoming image and compared with that of the reference image to find correspondences. The current position of the robot can then be determined by triangulation, since both the reference position and the map of the workspace are available. The method was tested by mounting the camera system at a number of random positions in an 11.0 m × 8.5 m room. The average localization error was 0.45 m. No mismatch of features between the reference and incoming images was found in any of the test cases.
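The abstract describes the pipeline only at a high level. As a rough illustration (not the authors' actual method), the Python sketch below assumes the omnidirectional image has been unwrapped into a panorama whose columns correspond to bearing angles, and that the robot's heading is known so bearings can be expressed in the map frame. The names hue_profile, match_offset and triangulate are illustrative, and the simple column-averaged hue stands in for the paper's modified hue profile.

    import numpy as np

    def hue_profile(unwrapped_hsv):
        """Mean hue per bearing column of an unwrapped (rows x bearings x 3) HSV panorama.

        Plain averaging ignores the circular nature of hue; the paper's "modified
        hue" addresses such issues, but that detail is not reproduced here.
        """
        return unwrapped_hsv[:, :, 0].mean(axis=0)

    def match_offset(current, reference):
        """Circular shift (in bearing bins) that best aligns the two hue profiles."""
        scores = [np.dot(np.roll(current, s), reference) for s in range(len(reference))]
        return int(np.argmax(scores))

    def triangulate(p1, b1, p2, b2):
        """Intersect two rays cast from known map positions p1, p2 along absolute
        bearings b1, b2 (radians) toward the robot; returns the estimated position."""
        d1 = np.array([np.cos(b1), np.sin(b1)])
        d2 = np.array([np.cos(b2), np.sin(b2)])
        A = np.column_stack((d1, -d2))
        t1, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
        return np.asarray(p1, float) + t1 * d1

    # Example: two landmarks at known positions, bearings measured toward the
    # robot; the ray intersection recovers the robot position (3.0, 2.0).
    print(triangulate((0.0, 0.0), np.arctan2(2.0, 3.0),
                      (6.0, 0.0), np.arctan2(2.0, -3.0)))

The correspondence step supplies the bearings: the shift returned by match_offset relates bearings in the current view to bearings in the reference view, after which two or more matched features are enough for the triangulation step.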

Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • David C. K. Yuen (1)
  • Bruce A. MacDonald (1)
  1. Department of Electrical and Electronic Engineering, The University of Auckland, Auckland, New Zealand