
Modified Modulated Hebb-Oja Learning Rule: A Method for Biologically Plausible Principal Component Analysis

  • Conference paper
Neural Information Processing (ICONIP 2007)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4984)

Abstract

This paper presents the Modified Modulated Hebb-Oja (MHO) method, which performs principal component analysis. The method applies the Time-Oriented Hierarchical Method to the recently proposed Modulated Hebb-Oja learning rule for principal subspace analysis. Compared to other well-known methods for principal component analysis, the proposed method has one feature that can be seen as desirable from a biological point of view: the synaptic efficacy learning rule does not need explicit information about the values of the other efficacies in order to modify an individual efficacy. Further advantages are the simplicity of the "neural circuits" that perform global computations and the fact that their number does not depend on the number of input and output neurons; a single global calculation circuit suffices. A similarity to part of the frog retinal circuit is also suggested.
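The MHO rule builds on Oja's single-neuron learning rule for principal component extraction, a standard result; the MHO update itself is not reproduced on this page. As a minimal illustration of the kind of local Hebbian update involved, the following NumPy sketch implements the classic Oja rule, in which one weight vector converges to the leading principal component of the input data. The data dimensions, learning rate, and random seed are arbitrary choices made for this demonstration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2-D data whose dominant principal direction is along [1, 0]
# (standard deviation 2.0 on the first axis, 0.5 on the second).
n, d = 10_000, 2
X = rng.normal(size=(n, d)) * np.array([2.0, 0.5])

# Random unit-norm initial weight vector.
w = rng.normal(size=d)
w /= np.linalg.norm(w)

eta = 0.005  # learning rate
for x in X:
    y = w @ x                    # neuron output (scalar)
    w += eta * y * (x - y * w)   # Oja's rule: Hebbian term minus decay y^2 * w
```

After training, `w` aligns (up to sign) with the dominant eigenvector of the input covariance. The decay term `-eta * y**2 * w` keeps the weight norm near 1 without any explicit normalization step, which is what makes the update local in the sense the abstract emphasizes.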




Editor information

Masumi Ishikawa, Kenji Doya, Hiroyuki Miyamoto, Takeshi Yamakawa


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jankovic, M., Martinez, P., Chen, Z., Cichocki, A. (2008). Modified Modulated Hebb-Oja Learning Rule: A Method for Biologically Plausible Principal Component Analysis. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4984. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69158-7_55

  • DOI: https://doi.org/10.1007/978-3-540-69158-7_55

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69154-9

  • Online ISBN: 978-3-540-69158-7

  • eBook Packages: Computer Science, Computer Science (R0)
