From Saliency to Eye Gaze: Embodied Visual Selection for a Pan-Tilt-Based Robotic Head

  • Conference paper
Advances in Visual Computing (ISVC 2011)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 6938)

Abstract

This paper introduces a model of gaze behavior suitable for robotic active vision. Built upon a saliency map that takes motion saliency into account, the model estimates the dynamics of different eye movements, allowing the system to switch among fixational movements, saccades, and smooth pursuit. We investigate the effect of embodying attentive visual selection in a pan-tilt camera system. The constrained physical system cannot follow the large fluctuations that characterize the maxima of a saliency map, so a strategy is required to dynamically select what is worth attending to and which behavior, fixation or target pursuit, to adopt. The main contributions of this work are a novel approach to real-time, motion-based saliency computation in video sequences, a dynamic model for gaze prediction from the saliency map, and the embodiment of the modeled dynamics to control active visual sensing.
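
To make the pipeline described in the abstract concrete (motion-based saliency, selection of a gaze target, pan-tilt actuation), the following is a minimal illustrative sketch, not the authors' model: it assumes Farnebäck dense optical flow as a stand-in for the motion-saliency computation, a simple winner-take-all choice of the gaze target, and a hypothetical small-angle mapping from pixel offset to pan and tilt commands; all function names and the field-of-view values are placeholders.

```python
# Illustrative sketch only; NOT the model proposed in the paper.
# Assumptions: Farneback optical flow as the motion-saliency proxy,
# winner-take-all target selection, small-angle pixel-to-angle mapping.
import cv2
import numpy as np


def motion_saliency(prev_gray, curr_gray):
    """Return a saliency map derived from optical-flow magnitude."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # Smooth so isolated noisy pixels do not dominate the maximum.
    return cv2.GaussianBlur(mag, (21, 21), 0)


def gaze_target(saliency):
    """Pixel coordinates (x, y) of the most salient location."""
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    return x, y


def pixel_to_pan_tilt(x, y, width, height, fov_h_deg=60.0, fov_v_deg=45.0):
    """Map the target's offset from the image centre to pan/tilt commands
    (small-angle approximation; field-of-view values are placeholders)."""
    pan = (x - width / 2.0) / width * fov_h_deg
    tilt = (y - height / 2.0) / height * fov_v_deg
    return pan, tilt
```

A controller built this way would jump to the instantaneous saliency maximum at every frame; as the abstract notes, the paper's model instead estimates the dynamics of the gaze target over time, so the head can settle into fixation or smoothly pursue a moving target rather than chase every fluctuation of the map.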




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mancas, M., Pirri, F., Pizzoli, M. (2011). From Saliency to Eye Gaze: Embodied Visual Selection for a Pan-Tilt-Based Robotic Head. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2011. Lecture Notes in Computer Science, vol 6938. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24028-7_13

  • DOI: https://doi.org/10.1007/978-3-642-24028-7_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-24027-0

  • Online ISBN: 978-3-642-24028-7
