
Unicycle-like Robots with Eye-in-Hand Monocular Cameras: From PBVS towards IBVS

  • Daniele Fontanelli
  • Paolo Salaris
  • Felipe A. W. Belo
  • Antonio Bicchi
Part of the Lecture Notes in Control and Information Sciences book series (LNCIS, volume 401)

Abstract

This chapter presents an introduction to current research on the visual servoing problem of guiding differentially driven robots, specifically unicycle-like vehicles, under limited field of view (FOV) constraints. The goal is to accurately servo the vehicle to a desired posture using only the feedback of an on-board camera. First, a position-based scheme is proposed, which adopts a hybrid control law to cope with the limited camera aperture. This scheme relies on a localization method based on an extended Kalman filter (EKF) that takes into account the robot motion model and odometric data. To extend the capabilities of the visual servoing scheme beyond existing solutions, which achieve similar goals only locally (i.e., when the desired and actual camera views are sufficiently similar), the proposed method visually navigates the robot through an extended visual map before eventually reaching the desired goal. Map construction is part of the approach, which is therefore called visual simultaneous localization and mapping (VSLAM) for servoing. The accuracy of the position-based scheme is intrinsically tied to the effectiveness of the localization process, i.e., to the estimation of 3D information about both the robot and the environment. A shortcut that bypasses this estimation uses visual information directly in the image domain. In this spirit, an image-based scheme is then presented. Its controller continuously tracks desired image feature trajectories, which represent optimal (shortest) paths for the vehicle from its initial 3D position to the desired one. The optimal trajectories satisfy the additional constraint of keeping a feature in sight of the camera and induce a partition of the robot's plane of motion into regions. As a consequence, the robot can use only visual data to determine the region to which it belongs and, hence, the associated optimal path. As in the position-based case, the effectiveness of the image-based scheme is improved by adopting appearance-based image maps.
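
To illustrate the position-based side of the chapter, the following is a minimal sketch (not the authors' implementation) of one EKF localization cycle for a unicycle: odometric inputs drive the prediction step, the bearing of a known landmark measured by the on-board camera drives the update, and the measurement is used only when the landmark falls inside the limited camera aperture. The landmark position, noise covariances and the +/- 30 degree half-aperture are illustrative assumptions.

    import numpy as np

    def wrap(a):
        # Wrap an angle to (-pi, pi].
        return (a + np.pi) % (2.0 * np.pi) - np.pi

    def ekf_predict(x, P, v, omega, dt, Q):
        # Propagate the unicycle state x = [px, py, theta] with odometric inputs (v, omega).
        px, py, th = x
        x_pred = np.array([px + v * dt * np.cos(th),
                           py + v * dt * np.sin(th),
                           wrap(th + omega * dt)])
        # Jacobian of the motion model with respect to the state.
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        return x_pred, F @ P @ F.T + Q

    def ekf_update_bearing(x, P, z, landmark, R):
        # Correct the estimate with the camera bearing z of a known landmark.
        dx, dy = landmark[0] - x[0], landmark[1] - x[1]
        h = wrap(np.arctan2(dy, dx) - x[2])        # predicted bearing
        q = dx ** 2 + dy ** 2
        H = np.array([[dy / q, -dx / q, -1.0]])    # measurement Jacobian
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x_new = x + (K @ np.array([wrap(z - h)])).ravel()
        x_new[2] = wrap(x_new[2])
        return x_new, (np.eye(3) - K @ H) @ P

    # One predict/update cycle; the landmark is used for correction only if its
    # bearing lies inside the (assumed) symmetric camera aperture of +/- 30 deg.
    x, P = np.zeros(3), np.eye(3) * 0.1
    Q, R = np.diag([1e-3, 1e-3, 1e-4]), np.array([[1e-3]])
    x, P = ekf_predict(x, P, v=0.2, omega=0.1, dt=0.1, Q=Q)
    landmark, z = np.array([2.0, 1.0]), 0.45       # illustrative landmark and measurement
    bearing = wrap(np.arctan2(landmark[1] - x[1], landmark[0] - x[0]) - x[2])
    if abs(bearing) <= np.radians(30.0):           # limited-FOV constraint
        x, P = ekf_update_bearing(x, P, z, landmark, R)

The FOV test mirrors the role of the hybrid/switching logic described above: when the feature leaves the camera aperture, the filter simply keeps integrating odometry until the feature is regained.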

Keywords

Mobile Robot, Optimal Path, Extended Kalman Filter, Optimal Trajectory, Scale Invariant Feature Transform


Copyright information

© Springer London 2010

Authors and Affiliations

  • Daniele Fontanelli (1)
  • Paolo Salaris (2)
  • Felipe A. W. Belo (2)
  • Antonio Bicchi (2)
  1. Department of Information Engineering and Computer Science, University of Trento, Povo, Italy
  2. Department of Electrical Systems and Automation, University of Pisa, Pisa, Italy
