A Localization Approach Based on Fixed 3D Objects for Autonomous Robots

  • Chien-Chou Lin
  • Liang-Zheng Huang
  • Hsin-Te Chiang
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 110)

Abstract

In this paper, an object-based localization method for mobile robots in real-time environments is proposed. The proposed system consists of a mobile platform and a LiDAR. The localization algorithm has four steps: (1) scanning a point cloud of the environment with the LiDAR mounted on the robot, (2) removing ground points and segmenting the remaining points into objects, (3) recognizing the objects with Point Feature Histogram (PFH) features, and (4) computing the current position and pose from the geometric relations between the recognized 3D objects. Compared with SLAM-based systems, the proposed method is more precise and efficient since neither a prior map nor a mapping process is required.
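
The abstract outlines the pipeline but gives no implementation details, so the following is only a minimal sketch of the four steps, assuming an Open3D-based implementation (the paper does not name a library) and using FPFH features as a readily available stand-in for the PFH features it describes; the file names scan.pcd and fixed_object.pcd are placeholders.

    # Minimal sketch of the four-step pipeline (assumed Open3D implementation;
    # FPFH is used in place of PFH, which Open3D does not expose directly).
    import numpy as np
    import open3d as o3d

    def remove_ground(pcd, dist_thresh=0.05):
        """Step 2a: fit a ground plane with RANSAC and drop its inliers."""
        _, inliers = pcd.segment_plane(distance_threshold=dist_thresh,
                                       ransac_n=3, num_iterations=1000)
        return pcd.select_by_index(inliers, invert=True)

    def segment_objects(pcd, eps=0.3, min_points=30):
        """Step 2b: cluster the remaining points into candidate objects (DBSCAN)."""
        labels = np.asarray(pcd.cluster_dbscan(eps=eps, min_points=min_points))
        return [pcd.select_by_index(np.where(labels == lbl)[0])
                for lbl in range(labels.max() + 1)]

    def describe(pcd, radius=0.25):
        """Step 3: surface features (FPFH here, PFH in the paper)."""
        pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=30))
        return o3d.pipelines.registration.compute_fpfh_feature(
            pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=radius * 2, max_nn=100))

    def match_to_model(cluster, model, max_dist=0.1):
        """Steps 3-4: feature-based RANSAC alignment of a scanned cluster to a
        known fixed-object model; the transform relates the scan frame to the object."""
        f_src, f_tgt = describe(cluster), describe(model)
        result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
            cluster, model, f_src, f_tgt, True, max_dist,
            o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3, [],
            o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
        return result.transformation, result.fitness

    if __name__ == "__main__":
        scan = o3d.io.read_point_cloud("scan.pcd")            # step 1: LiDAR scan (placeholder file)
        objects = segment_objects(remove_ground(scan))        # step 2: ground removal + segmentation
        model = o3d.io.read_point_cloud("fixed_object.pcd")   # known fixed 3D object (placeholder file)
        best = max((match_to_model(c, model) for c in objects), key=lambda r: r[1])
        print("Estimated transform to fixed object:\n", best[0])

In the paper's method, the final robot position and pose are derived from the geometric relations among several recognized fixed objects; the sketch above only returns the alignment transform to a single object model.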

Keywords

LiDAR · PFH · Surface matching · 3D registration · Calibration · Localization

Notes

Acknowledgment

This work was financially supported by the “Intelligent Recognition Industry Service Center” from The Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Chien-Chou Lin (1)
  • Liang-Zheng Huang (1)
  • Hsin-Te Chiang (1)
  1. National Yunlin University of Science and Technology, Yunlin, Taiwan, ROC