Vision Based Pose Estimation of Multiple Peg-in-Hole for Robotic Assembly

  • Pitchandi Nagarajan
  • S. Saravana Perumaal
  • B. Yogameena
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10481)

Abstract

In vision-assisted robotic peg-in-hole assembly, vision sensors estimate the pose (position and orientation) of the mating components, a crucial step in aligning the hole component with the corresponding moving peg component. The accuracy of this estimate, obtained by mapping between the image and the task environment using a fixed overhead camera or a camera mounted on the robot arm, determines the performance of the assembly. The wheel-and-hub assembly of an automobile contains multiple holes and pegs in its mating parts, which makes pose estimation more complex. Inaccurate pose estimation introduces lateral and/or axial misalignment between the mating components during insertion, reducing the success rate of the assembly (i.e., assembly without jamming). Accordingly, this work proposes a pose estimation algorithm for multiple peg-in-hole assembly that uses a genetic-algorithm-based two-stage camera calibration procedure. The proposed algorithm has been tested on estimating the pose of the multiple pegs of a car wheel hub. The results show that the proposed method estimates the pose of the pegs accurately with minimal re-projection error.
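The abstract only summarizes the genetic-algorithm-based calibration; the paper's actual two-stage procedure is not reproduced here. The following is a minimal sketch, assuming a simple pinhole model, synthetic 3D-2D correspondences, and illustrative parameter bounds (all hypothetical, not taken from the paper), of how a genetic algorithm can refine intrinsic camera parameters by minimizing mean re-projection error, the criterion cited in the abstract.

```python
# Minimal sketch: GA-style refinement of pinhole intrinsics (fx, fy, cx, cy)
# by minimizing mean re-projection error over known 3D-2D correspondences.
# Synthetic data and parameter bounds are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "ground-truth" camera and calibration-target points (expressed in the camera frame).
true_K = np.array([[800.0, 0.0, 320.0],
                   [0.0, 820.0, 240.0],
                   [0.0, 0.0, 1.0]])
pts_3d = rng.uniform([-0.5, -0.5, 1.0], [0.5, 0.5, 2.0], size=(40, 3))

def project(K, pts):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coordinates."""
    uvw = pts @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

# Noisy 2D detections play the role of measured image features.
observed_2d = project(true_K, pts_3d) + rng.normal(0.0, 0.3, size=(40, 2))

def reprojection_error(params):
    """Mean pixel distance between projected and observed points for a candidate (fx, fy, cx, cy)."""
    fx, fy, cx, cy = params
    K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
    return np.mean(np.linalg.norm(project(K, pts_3d) - observed_2d, axis=1))

# Simple generational GA: elitism, blend crossover, Gaussian mutation.
lo = np.array([500.0, 500.0, 200.0, 150.0])   # lower parameter bounds (assumed)
hi = np.array([1200.0, 1200.0, 450.0, 350.0]) # upper parameter bounds (assumed)
pop = rng.uniform(lo, hi, size=(60, 4))

for _ in range(150):
    fitness = np.array([reprojection_error(p) for p in pop])
    elite = pop[np.argsort(fitness)[:10]]                 # keep the 10 best candidates
    parents = elite[rng.integers(0, 10, size=(50, 2))]    # random parent pairs from the elite
    alpha = rng.uniform(size=(50, 1))
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
    children += rng.normal(0.0, 2.0, size=children.shape)           # Gaussian mutation
    pop = np.vstack([elite, np.clip(children, lo, hi)])

best = pop[np.argmin([reprojection_error(p) for p in pop])]
print("estimated fx, fy, cx, cy:", np.round(best, 1))
print("mean re-projection error (px):", round(reprojection_error(best), 3))
```

In practice, a two-stage scheme of the kind the abstract describes would typically pair such a global search with a local refinement step and would estimate extrinsics and lens distortion as well; this sketch only illustrates the re-projection-error objective driving the genetic search.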

Keywords

Pose estimation · Multiple peg-in-hole · Camera calibration · Genetic algorithm · Robot assembly

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Pitchandi Nagarajan (1)
  • S. Saravana Perumaal (1)
  • B. Yogameena (2)
  1. Department of Mechanical Engineering, Thiagarajar College of Engineering, Madurai, India
  2. Department of Electronics and Communication Engineering, Thiagarajar College of Engineering, Madurai, India