Minimalist Artificial Eye for Autonomous Robots and Path Planning
Visual tracking is a capability of great importance for an artificial autonomous system that interacts with its environment, and it remains a very active open research problem in computer vision. In this paper, a minimalist tracking system for autonomous robots, designed primarily for navigation tasks, is proposed. The concept of minimalism translates into a very low-end processing system that nevertheless performs identification and tracking in real time. The proposed scheme is evaluated experimentally with a basic tracking task in which the system identifies a geometric mark in the environment, calculates its three-dimensional position, and coordinates the movement of the eye in order to reduce the distance to the target and improve its focus.
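The pipeline described above (detect a mark of known geometry, recover its 3D position, then steer the eye toward it) can be sketched with a pinhole camera model. This is a minimal illustration under assumed parameters, not the paper's implementation: the focal length, marker size, principal point, and function names are all hypothetical.

```python
import math

# Assumed camera/marker parameters (illustrative only).
FOCAL_PX = 500.0       # focal length in pixels
MARKER_SIZE_M = 0.10   # physical marker width in meters
CX, CY = 320.0, 240.0  # principal point (image center)

def locate_marker(u, v, pixel_width):
    """Back-project the marker center (u, v) to camera coordinates.

    Depth follows from the apparent size of a marker of known physical
    width; x and y follow from the standard pinhole projection equations.
    """
    z = FOCAL_PX * MARKER_SIZE_M / pixel_width
    x = (u - CX) * z / FOCAL_PX
    y = (v - CY) * z / FOCAL_PX
    return x, y, z

def aim_angles(x, y, z):
    """Pan/tilt angles (radians) that would re-center the marker."""
    pan = math.atan2(x, z)    # horizontal offset relative to depth
    tilt = math.atan2(-y, z)  # vertical offset (image y grows downward)
    return pan, tilt

# A marker 100 px right of center, appearing 50 px wide, lies 1 m away
# and 0.2 m to the right; the eye pans right to face it.
x, y, z = locate_marker(420.0, 240.0, 50.0)
pan, tilt = aim_angles(x, y, z)
```

Iterating this detect-locate-aim loop on each frame yields the closed-loop behavior the abstract describes: the eye progressively reduces the angular distance to the target and keeps it centered.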
Keywords: Autonomous robots · Bio-inspired system · Object detection · Object tracking
This work was supported by the District University Francisco José de Caldas, in part through the CIDC and in part through the Technological Faculty. The views expressed in this paper are not necessarily endorsed by the District University. The authors thank the research groups DIGITI and ARMOS for evaluating prototypes of the ideas and strategies, and especially Jesús David Borda Guerrero and Carlos Andrés Nieto Ortiz for their support in the development of the prototype.