Abstract
Human vision is a source of inspiration for much research in robotics. Attention mechanisms have received much of this effort; however, aspects such as gaze control and the modular composition of vision capabilities have been analyzed far less. This paper describes the architecture of an active vision system conceived to ease the concurrent use of the system by several visual tasks. We describe the functional architecture of the system in detail and provide several solutions to the problem of sharing visual attention when several visual tasks must be interleaved. The system's design hides this complexity from client processes, which can be designed as if they were exclusive users of the visual system. In addition, software engineering principles for design and integration, often overlooked in this kind of development, have been considered. Some preliminary results on a real robotic platform are also provided.
This work has been partially supported by the Spanish Education Ministry and FEDER (project TIN2004-07087) and by the Canary Islands Government (project PI2003/160).
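The sharing scheme summarized in the abstract, in which each client task issues gaze requests as if it owned the camera while the system interleaves them, can be illustrated with a minimal sketch. This is not the paper's implementation; the arbiter, the round-robin policy, and all task and field names here are illustrative assumptions.

```python
# Illustrative sketch (not the paper's actual architecture): a gaze
# arbiter that time-slices the camera head among several visual tasks.
# Each task submits requests as if it were the exclusive user; the
# arbiter interleaves them round-robin, hiding the sharing from clients.
from collections import deque
from dataclasses import dataclass

@dataclass
class GazeRequest:
    task: str    # hypothetical client task name
    pan: float   # desired pan angle (degrees)
    tilt: float  # desired tilt angle (degrees)

class GazeArbiter:
    def __init__(self):
        self.queues = {}      # one pending-request queue per client task
        self.order = deque()  # round-robin order of registered tasks

    def register(self, task):
        self.queues[task] = deque()
        self.order.append(task)

    def request(self, req):
        self.queues[req.task].append(req)

    def next_fixation(self):
        """Pick the next gaze target, rotating fairly among tasks."""
        for _ in range(len(self.order)):
            task = self.order[0]
            self.order.rotate(-1)  # move this task to the back of the line
            if self.queues[task]:
                return self.queues[task].popleft()
        return None  # no task has a pending request

arbiter = GazeArbiter()
for t in ("track_person", "detect_obstacle"):
    arbiter.register(t)
arbiter.request(GazeRequest("track_person", 10.0, -5.0))
arbiter.request(GazeRequest("detect_obstacle", -30.0, 0.0))
arbiter.request(GazeRequest("track_person", 12.0, -5.0))

served = [arbiter.next_fixation().task for _ in range(3)]
print(served)  # tasks alternate rather than one monopolizing the head
```

A real system would weigh requests by task urgency rather than pure round-robin; the point of the sketch is only that clients never see each other, which is the transparency property the abstract claims.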
Copyright information
© 2007 Springer Berlin Heidelberg
Cite this paper
Hernandez, D., Cabrera, J., Naranjo, A., Dominguez, A., Isern, J. (2007). Sharing Gaze Control in a Robotic System. In: Mira, J., Álvarez, J.R. (eds) Nature Inspired Problem-Solving Methods in Knowledge Engineering. IWINAC 2007. Lecture Notes in Computer Science, vol 4528. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73055-2_49
Print ISBN: 978-3-540-73054-5
Online ISBN: 978-3-540-73055-2