Minimalist Artificial Eye for Autonomous Robots and Path Planning

  • Omar Espinosa
  • Luisa Castañeda
  • Fredy Martínez
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9375)


Visual tracking is a feature of great importance for an artificial autonomous system that interacts with its environment, and it remains an active open research problem in computer vision. In this paper, a minimalist tracking system for autonomous robots, designed primarily for navigation tasks, is proposed. The concept of minimalism translates into a very low-end processing system that nevertheless performs identification and tracking in real time. The proposed scheme is evaluated experimentally with a basic tracking task in which the system identifies a geometric mark in the environment, calculates its three-dimensional position, and coordinates the movement of the eye so as to reduce the distance to the target and improve its focus.
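The tracking task described above (locate a geometric mark of known size, recover its three-dimensional position, and steer the eye to re-centre it) can be sketched with a standard pinhole-camera model. The paper does not publish its calibration values or control law, so the focal length, marker width, principal point, and the two helper functions below are illustrative assumptions only, not the authors' implementation.

```python
import math

# Hypothetical calibration values (assumptions, not from the paper).
FOCAL_PX = 600.0       # focal length in pixels
MARKER_WIDTH_M = 0.10  # real width of the geometric mark, in metres
CX, CY = 320.0, 240.0  # principal point of a 640x480 image

def estimate_position(u, v, w_px):
    """Estimate the 3-D position of a detected mark from its pixel
    centre (u, v) and apparent width w_px, via the pinhole model."""
    z = FOCAL_PX * MARKER_WIDTH_M / w_px  # depth: Z = f * W / w
    x = (u - CX) * z / FOCAL_PX           # lateral offset
    y = (v - CY) * z / FOCAL_PX           # vertical offset
    return x, y, z

def eye_command(u, v):
    """Pan/tilt angles (radians) that would re-centre the mark."""
    pan = math.atan2(u - CX, FOCAL_PX)
    tilt = math.atan2(v - CY, FOCAL_PX)
    return pan, tilt

# A mark 60 px wide at the image centre sits 1 m ahead, on axis.
print(estimate_position(320.0, 240.0, 60.0))  # (0.0, 0.0, 1.0)
print(eye_command(320.0, 240.0))              # (0.0, 0.0)
```

Under these assumptions the eye controller only needs the pixel error of the mark relative to the principal point; depth falls out of the known marker size, which is what lets such a low-end processor track in real time.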


Autonomous robots · Bio-inspired system · Object detection · Object tracking



This work was supported by the District University Francisco José de Caldas, in part through the CIDC and in part through the Technological Faculty. The views expressed in this paper are not necessarily endorsed by District University. The authors thank the research groups DIGITI and ARMOS for evaluating prototypes of these ideas and strategies, and especially Jesús David Borda Guerrero and Carlos Andrés Nieto Ortiz for their support in developing the prototype.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Omar Espinosa (1)
  • Luisa Castañeda (1)
  • Fredy Martínez (1)
  1. District University Francisco José de Caldas, Bogotá, Colombia
