Kernel Methods for Nonlinear Discriminative Data Analysis

  • Conference paper

Energy Minimization Methods in Computer Vision and Pattern Recognition (EMMCVPR 2005)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 3757)

Abstract

Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance when the underlying data structure is nonlinear or the cluster structure is complex. To address these problems, we investigate a kernel analogue of OCA, which consists of applying OCA techniques to the data after it has been mapped nonlinearly into a new feature space, typically a high-dimensional (possibly infinite-dimensional) Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.
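The kernel construction described in the abstract can be made concrete with a short sketch. The Python code below is not from the paper: the RBF kernel, the bandwidth `gamma`, and the kernel-PCA-style centering and eigendecomposition are illustrative assumptions. It shows how data can be mapped nonlinearly into a feature space and given finite-dimensional coordinates there, after which a linear subspace method such as OCA could be run on the resulting coordinates; OCA's own subspace optimization is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) (illustrative kernel choice)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def feature_space_coordinates(X, gamma=0.5, dim=10):
    """Return coordinates of the nonlinearly mapped data in feature space,
    via the eigendecomposition of the centered Gram matrix (a kernel-PCA-style
    construction; a linear method like OCA would then act on these coordinates)."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the implicit feature vectors: K_c = (I - 11^T/n) K (I - 11^T/n)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]   # reorder to descending
    keep = vals[:dim] > 1e-12                # drop numerically zero directions
    vals, vecs = vals[:dim][keep], vecs[:, :dim][:, keep]
    # Coordinates of the mapped points along unit-norm feature-space directions
    return vecs * np.sqrt(vals)

# Toy usage: two concentric noisy rings, not linearly separable in the plane
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 40)
r = np.repeat([1.0, 3.0], 20)
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.standard_normal((40, 2))
Z = feature_space_coordinates(X, gamma=0.5, dim=5)
print(Z.shape)  # (40, 5): the data now lives where a linear subspace method can act
```

Note that in the infinite-dimensional case the feature map is never formed explicitly; every computation goes through kernel evaluations, which is what keeps the high- or infinite-dimensional Hilbert space tractable.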

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liu, X., Mio, W. (2005). Kernel Methods for Nonlinear Discriminative Data Analysis. In: Rangarajan, A., Vemuri, B., Yuille, A.L. (eds) Energy Minimization Methods in Computer Vision and Pattern Recognition. EMMCVPR 2005. Lecture Notes in Computer Science, vol 3757. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11585978_38

  • DOI: https://doi.org/10.1007/11585978_38

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-30287-2

  • Online ISBN: 978-3-540-32098-2

  • eBook Packages: Computer Science (R0)
