Abstract
This paper presents the Modified Modulated Hebb-Oja (MHO) method, which performs principal component analysis. The method is based on applying the Time-Oriented Hierarchical Method to a recently proposed principal subspace analysis rule called the Modulated Hebb-Oja learning rule. Compared with other well-known methods for principal component analysis, the proposed method has one feature that can be seen as desirable from a biological point of view: the synaptic efficacy learning rule does not need explicit information about the values of the other efficacies in order to modify an individual efficacy. The simplicity of the "neural circuits" that perform global computations, and the fact that their number does not depend on the number of input and output neurons, are further good features of the proposed method; indeed, only one global calculation circuit is necessary. A similarity to part of the frog retinal circuit is also suggested.
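The exact MHO update is not given in this abstract, so the sketch below illustrates only the underlying idea it builds on: the classic single-neuron Oja rule, in which each synaptic weight is modified using the pre-synaptic input, the post-synaptic output, and its own current value, without explicit access to the other weights. The function name and toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def oja_pca(X, eta=0.01, epochs=50, seed=0):
    """Estimate the first principal direction of X (n_samples, n_dims)
    with Oja's local learning rule (a sketch, not the MHO method itself)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                   # post-synaptic neuron output
            w += eta * y * (x - y * w)  # local Hebbian update with Oja's decay
    return w

# Toy data whose principal axis lies along (1, 1)/sqrt(2)
rng = np.random.default_rng(1)
t = rng.normal(size=500)
X = np.outer(t, [1.0, 1.0]) + 0.1 * rng.normal(size=(500, 2))

w = oja_pca(X)
v = np.array([1.0, 1.0]) / np.sqrt(2.0)
print(abs(w @ v))  # close to 1: the learned weight vector aligns with the principal axis
```

Note that the update for each weight component `w[i]` uses only `x[i]`, `y`, and `w[i]` itself, which is the locality property the abstract highlights as biologically plausible.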
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Jankovic, M., Martinez, P., Chen, Z., Cichocki, A. (2008). Modified Modulated Hebb-Oja Learning Rule: A Method for Biologically Plausible Principal Component Analysis. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4984. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69158-7_55
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-69154-9
Online ISBN: 978-3-540-69158-7