Kernel Methods for Nonlinear Discriminative Data Analysis
Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance when the underlying data structure is nonlinear or the cluster structure is complex. To address these limitations, we investigate a kernel analogue of OCA, in which OCA is applied to the data after a nonlinear map into a new feature space, typically a high-dimensional (possibly infinite-dimensional) Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.
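For concreteness, the sketch below illustrates the generic kernel-trick machinery on which such a construction rests: forming a Gaussian-kernel Gram matrix from the data and centering it, which corresponds to centering the implicitly mapped points in feature space. This is a minimal, generic illustration, not the kernel OCA algorithm itself; the function names and the bandwidth parameter `sigma` are assumptions chosen for exposition.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).

    X: (n, d) data matrix; sigma is an illustrative bandwidth choice.
    """
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    d2 = np.maximum(d2, 0.0)  # guard against small negative round-off
    return np.exp(-d2 / (2.0 * sigma ** 2))

def center_gram(K):
    """Center the implicit feature map: K_c = (I - J) K (I - J), J = 11^T / n.

    After centering, K_c is the Gram matrix of the mean-subtracted
    feature-space images of the data.
    """
    n = K.shape[0]
    J = np.full((n, n), 1.0 / n)
    return K - J @ K - K @ J + J @ K @ J

# Hypothetical usage: a centered Gram matrix like K_c is the typical
# starting point for subspace search in kernelized methods.
X = np.random.default_rng(0).normal(size=(50, 8))
K_c = center_gram(gaussian_gram(X, sigma=2.0))
```

Because every quantity is expressed through inner products in feature space, the Gram matrix suffices: the nonlinear map itself is never computed explicitly, even when the feature space is infinite-dimensional.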
Keywords: Kernel Function, Recognition Rate, Performance Function, Gaussian Kernel, Recognition Performance