4 × 2D Visual Servoing Approach for Advanced Robot Applications

  • Mohamad Bdiwi
  • Jozef Suchý
  • Matthias Putz
Part of the Studies in Systems, Decision and Control book series (SSDC, volume 175)


Vision information can serve as sensory input in both open-loop and closed-loop robot control. Visual servoing, however, is performed only within closed-loop control: in the open-loop case the vision sensor merely provides an initial extraction of features from which the robot motion sequence is generated, and both the features and the motion can be generated offline. A closed-loop robot system, in contrast, uses vision as a real-time sensor and consists of two phases: tracking and control. Tracking provides a continuous estimation and update of the features during robot/object motion, and on the basis of this information a real-time control loop is built. The main contribution of this work is a visual servoing approach that exploits the images obtained from a Kinect (RGB-D) camera. The proposed approach, called 4 × 2D visual servoing, combines the corresponding color and depth images to build two new images; from these four images the control error signals are computed in order to track objects. This chapter first surveys the types of visual servoing, then introduces the 4 × 2D visual servoing approach together with a newly proposed coordinate system, the visible side coordinate system, and finally illustrates the concept of the approach and how the error signal is calculated.
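The abstract does not detail how the 4 × 2D error signals are computed, but the classical image-based servoing law they build on can be sketched as follows. This is an illustrative sketch only, not the chapter's method: it assumes point features in normalized image coordinates, with each feature's depth read directly from the registered Kinect depth map, and uses the standard control law v = -λ L⁺ e.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Classical interaction (image Jacobian) matrix for one image point
    (x, y) in normalized coordinates at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Camera twist command v = -gain * pinv(L) @ e for point features.

    features, desired: (N, 2) arrays of current/desired image points.
    depths: (N,) depths per point, e.g. sampled from the RGB-D depth image.
    Returns a 6-vector (vx, vy, vz, wx, wy, wz).
    """
    error = (features - desired).reshape(-1)          # stacked feature error e
    L = np.vstack([interaction_matrix(x, y, Z)        # stacked Jacobians
                   for (x, y), Z in zip(features, depths)])
    return -gain * np.linalg.pinv(L) @ error
```

When the current features coincide with the desired ones, the error and hence the commanded twist are zero; the depth values, which this law needs per feature, are exactly what the RGB-D sensor supplies without pose estimation.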


Keywords: Visual servoing · Robot vision · RGB-D camera systems



Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Robotics Department, Fraunhofer IWU, Chemnitz, Germany
  2. Technical University of Chemnitz, Chemnitz, Germany
