Abstract

Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction, designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance when the underlying data structure is nonlinear or the cluster structure is complex. To address these problems, we investigate a kernel analogue of OCA, which consists of applying OCA to the data after it has been mapped nonlinearly into a new feature space, typically a high-dimensional (possibly infinite-dimensional) Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.
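To make the kernel idea concrete, below is a minimal sketch of the implicit feature-space mapping the abstract describes. It uses the standard kernel-PCA eigendecomposition with a Gaussian kernel as a stand-in for the OCA subspace search; it is not the authors' algorithm, and all function names and parameters (gaussian_kernel_matrix, kernel_feature_map, sigma, n_components) are illustrative assumptions rather than taken from the paper.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    """Pairwise Gaussian kernel: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-sq_dists / (2.0 * sigma**2))

def kernel_feature_map(X, sigma=1.0, n_components=2):
    """Project data onto leading eigenvectors of the centered kernel matrix.

    This is the standard kernel-PCA construction; a kernel analogue of OCA
    would instead search this implicit feature space for a subspace that
    optimizes a recognition-performance criterion.
    """
    n = X.shape[0]
    K = gaussian_kernel_matrix(X, sigma)
    # Center the kernel matrix in feature space: Kc = K - 1K - K1 + 1K1.
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecompose and keep the leading eigenvectors as nonlinear components.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Normalize so each feature-space component has unit norm (guard tiny eigenvalues).
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas  # coordinates of each sample in the learned subspace

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))    # 100 samples, 10 ambient dimensions
    Z = kernel_feature_map(X, sigma=2.0, n_components=3)
    print(Z.shape)                    # (100, 3)
```

In the kernelized OCA setting, the projection returned here would be replaced by a subspace chosen to maximize a recognition performance function over feature-space projections, rather than by the leading kernel eigenvectors.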

Keywords

Kernel Function, Recognition Rate, Performance Function, Gaussian Kernel, Recognition Performance

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Xiuwen Liu (1)
  • Washington Mio (2)
  1. Department of Computer Science, Florida State University, Tallahassee, USA
  2. Department of Mathematics, Florida State University, Tallahassee, USA
