Journal of Central South University, Volume 18, Issue 3, pp 627–632

Plane extraction for navigation of humanoid robot

  • Tong Zhang (张彤)
  • Nan-Feng Xiao (肖南峰)

Abstract

In order for a humanoid robot to walk freely in complicated environments, it needs a reliable capability for obtaining plane information from its surroundings. A system for extracting planes from stereo vision data was presented. After the depth image was obtained, the pixels of each scan line were traversed and split into straight line segments, and the adjacency relations between line segments were kept in a linked structure. Groups of three neighbouring line segments were selected as seed regions and stored in a queue, and each plane region was then grown outward around its seed region. Region growing continued until the queue of seed regions was empty; after trimming, the plane edges became smooth and the extracted planes were obtained. In the experiments, two models were used: a pipe and a staircase. The two planes of the pipe model and the six planes of the stairs model were extracted accurately, and the speed and precision of the algorithm satisfy the demands of humanoid robot navigation.
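The abstract describes a scan-line splitting and region-growing pipeline. The Python sketch below illustrates that general idea only; the function names, thresholds, data layout and seed-selection heuristic are assumptions for illustration and are not taken from the authors' implementation.

```python
# Minimal sketch (not the authors' code): scan-line splitting followed by
# queue-driven region growing over line segments. Requires only numpy.
import numpy as np
from collections import deque


def split_scanline(points, dist_thresh=0.01):
    """Split one scan line of 3D points (N x 3) into straight segments by
    recursively breaking at the point farthest from the end-to-end chord."""
    segments = []

    def split(lo, hi):
        if hi - lo < 2:
            segments.append((lo, hi))
            return
        chord = points[hi] - points[lo]
        chord = chord / (np.linalg.norm(chord) + 1e-12)
        rel = points[lo:hi + 1] - points[lo]
        # perpendicular distance of every point to the chord
        dist = np.linalg.norm(rel - np.outer(rel @ chord, chord), axis=1)
        k = int(np.argmax(dist))
        if dist[k] > dist_thresh:      # not straight enough: split at the outlier
            split(lo, lo + k)
            split(lo + k, hi)
        else:
            segments.append((lo, hi))

    split(0, len(points) - 1)
    return segments                    # list of (start, end) index pairs


def fit_plane(pts):
    """Least-squares plane n·p + d = 0 through a set of 3D points."""
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, -float(n @ centroid)


def grow_planes(points, segments, neighbours, plane_thresh=0.02):
    """Grow planar regions from seed regions of three adjacent line segments.
    'segments' maps a segment id to an index array into 'points';
    'neighbours' maps a segment id to segment ids on adjacent scan lines."""
    assigned, planes = set(), []
    # hypothetical seed choice: a segment together with two of its neighbours
    seeds = [(s, nbs[0], nbs[1]) for s, nbs in neighbours.items() if len(nbs) >= 2]
    for seed in seeds:
        if any(s in assigned for s in seed):
            continue
        region, queue = set(seed), deque(seed)
        while queue:
            s = queue.popleft()
            # refit the plane to all points currently in the region
            n, d = fit_plane(points[np.concatenate([segments[i] for i in region])])
            for nb in neighbours.get(s, []):
                if nb in region or nb in assigned:
                    continue
                # accept a neighbouring segment whose points all lie near the plane
                if np.all(np.abs(points[segments[nb]] @ n + d) < plane_thresh):
                    region.add(nb)
                    queue.append(nb)
        assigned |= region
        planes.append((fit_plane(points[np.concatenate([segments[i] for i in region])]), region))
    return planes                      # [((normal, d), set of segment ids), ...]
```

Note that the paper keeps the seed regions themselves in a queue and grows each plane around its seed until that queue is empty, whereas this sketch simply iterates over candidate seeds and queues individual segments inside one region; the plane is refitted as the region grows, so the normal estimate improves as more segments are absorbed. Edge trimming after growing is omitted here.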

Key words

humanoid robot navigation; line segment splitting; region growing; plane extraction; depth image

Copyright information

© Central South University Press and Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Tong Zhang (张彤) (1, 2)
  • Nan-Feng Xiao (肖南峰) (1)
  1. School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
  2. Computer Department, Guangdong Police Officers College, Guangzhou, China
