
Non-Euclidean Principal Component Analysis and Oja’s Learning Rule – Theoretical Aspects

  • Conference paper

In: Advances in Self-Organizing Maps

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 198))

Abstract

Principal component analysis based on Hebbian learning was originally designed for data processing in Euclidean spaces. In this contribution we present an extension of Oja's online learning approach to non-Euclidean spaces. First, we review the kernel principal component approach and show that, for differentiable kernels, it can be formulated as an online learning scheme. Hence, PCA can be carried out explicitly in the data space, now equipped with a non-Euclidean metric. Moreover, the theoretical framework can be extended to principal component learning in Banach spaces based on semi-inner products. This becomes particularly important when learning in l_p-norm spaces with p ≠ 2 is considered. In this contribution we focus on the mathematics and the theoretical justification of the approach.
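The Euclidean starting point that the paper extends is Oja's classical online rule, w ← w + η y (x − y w) with y = wᵀx, whose Hebbian term grows w along high-variance directions while the decay term keeps it normalized. A minimal sketch of that baseline rule (function name, learning rate, and toy data are illustrative assumptions, not taken from the paper):

```python
# Sketch of Oja's online rule for the first principal component
# (the Euclidean baseline; the paper's contribution generalizes this
# to kernel-induced and Banach-space geometries).
import numpy as np

def oja_first_component(X, eta=0.01, epochs=50, seed=0):
    """Estimate the leading principal direction of X (n_samples, n_dims)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                   # projection onto current direction
            w += eta * y * (x - y * w)  # Hebbian growth + Oja's decay term
    return w / np.linalg.norm(w)

# Toy data with variance dominated by the direction (1, 1)/sqrt(2)
rng = np.random.default_rng(1)
t = rng.normal(size=500)
X = np.outer(t, [1.0, 1.0]) + 0.1 * rng.normal(size=(500, 2))
X -= X.mean(axis=0)
w = oja_first_component(X)
```

On such data, w converges (up to sign) to the top eigenvector of the sample covariance matrix, which is what the batch PCA eigendecomposition would return.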





Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Biehl, M., Kästner, M., Lange, M., Villmann, T. (2013). Non-Euclidean Principal Component Analysis and Oja’s Learning Rule – Theoretical Aspects. In: Estévez, P., Príncipe, J., Zegers, P. (eds) Advances in Self-Organizing Maps. Advances in Intelligent Systems and Computing, vol 198. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35230-0_3

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-35230-0_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-35229-4

  • Online ISBN: 978-3-642-35230-0

  • eBook Packages: Engineering (R0)
