
Biologically Inspired Saliency Map Model for Bottom-up Visual Attention

  • Sang-Jae Park
  • Jang-Kyoo Shin
  • Minho Lee
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2525)

Abstract

In this paper, we propose a new saliency map model that finds a selective attention region in a static color image for human-like fast scene analysis. We consider the roles of retinal cells in edge detection and cone opponency, and also reflect the role of the lateral geniculate nucleus in detecting symmetry properties of an interesting object, such as shape and pattern. In addition, independent component analysis (ICA) is used to learn a filter that generates a salient region from feature maps constructed from edge, color-opponency, and symmetry information, modeling the role of redundancy reduction in the visual cortex. Computer experimental results show that the proposed model successfully generates a plausible sequence of salient regions.
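The pipeline summarized above (feature maps from edge and color-opponency information, ICA filters learned over patches of those maps, and pooled filter responses forming a saliency map) can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' implementation: the feature definitions, patch size, number of filters, and the absolute-response pooling rule are assumptions, and the symmetry feature map is omitted for brevity.

```python
# Illustrative sketch (not the authors' code): build simple feature maps from an
# RGB image, learn ICA filters over patches of those maps, and pool the filter
# responses into a rough saliency map. All specific choices are assumptions.
import numpy as np
from scipy.ndimage import sobel, convolve
from sklearn.decomposition import FastICA

def feature_maps(rgb):
    """Edge and color-opponency maps (symmetry map omitted for brevity)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    intensity = rgb.mean(axis=2)
    edge = np.hypot(sobel(intensity, axis=0), sobel(intensity, axis=1))
    rg = r - g                       # red-green opponency
    by = b - (r + g) / 2.0           # blue-yellow opponency
    return [edge, rg, by]

def extract_patches(img, size=8, step=4):
    """Flattened square patches sampled on a regular grid."""
    h, w = img.shape
    return np.array([img[y:y + size, x:x + size].ravel()
                     for y in range(0, h - size, step)
                     for x in range(0, w - size, step)])

def saliency(rgb, n_filters=16, size=8):
    maps = feature_maps(rgb)
    patches = np.vstack([extract_patches(m, size) for m in maps])
    ica = FastICA(n_components=n_filters, random_state=0)
    ica.fit(patches)                 # learn ICA basis filters from patches
    filters = ica.components_.reshape(n_filters, size, size)
    # Pool absolute filter responses over all feature maps.
    s = np.zeros(rgb.shape[:2])
    for m in maps:
        for f in filters:
            s += np.abs(convolve(m, f))
    return s / s.max()

if __name__ == "__main__":
    img = np.random.rand(64, 64, 3)  # stand-in for a static color image
    print(saliency(img).shape)       # (64, 64) saliency map
```

In the full model, the most salient region would be selected from this map first, then inhibited so that attention shifts to the next most salient region, producing the scan-path sequence described in the abstract.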

Keywords

Visual Cortex; Independent Component Analysis; Lateral Geniculate Nucleus; Salient Region



Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Sang-Jae Park¹
  • Jang-Kyoo Shin²
  • Minho Lee²
  1. Dept. of Sensor Engineering, Kyungpook National University, Taegu, Korea
  2. School of Electronic and Electrical Engineering, Kyungpook National University, Taegu, Korea
