Abstract
Kernel PCA, like other kernel-based techniques, suffers from high memory requirements and computational cost, as well as from a tedious training procedure. This work shows that the objective function of Kernel PCA, i.e. the reconstruction error, can be upper bounded by the distortion of the K-means algorithm in the feature space. From this relation, we propose simplifying Kernel PCA's training procedure using the kernel K-means algorithm. Applying a preimage reconstruction algorithm allows further simplification and leads to a more computationally economical solution.
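The relation at the heart of the abstract can be made precise with standard definitions; the notation below is assumed for illustration, not taken from the paper. Writing $\Phi$ for the feature map, $P_q$ for the orthogonal projection onto the leading $q$ kernel principal directions, and $m_c$ for the mean of cluster $C_c$ in feature space, the bound (for feature vectors centered in feature space) reads

\[
E_{\mathrm{KPCA}}(q) \;=\; \sum_{i=1}^{n} \bigl\|\Phi(x_i) - P_q\,\Phi(x_i)\bigr\|^{2}
\;\le\;
\sum_{c=1}^{q} \sum_{i \in C_c} \bigl\|\Phi(x_i) - m_c\bigr\|^{2} \;=\; E_{\mathrm{KM}}(q),
\]

since the $q$ centroids span a subspace of dimension at most $q$, projecting each point onto that span can only reduce its error relative to using its centroid, and $P_q$ is the error-minimizing rank-$q$ projection. Minimizing the K-means distortion in feature space therefore drives down an upper bound on the Kernel PCA reconstruction error.

Kernel K-means itself never needs explicit feature vectors: squared distances to centroids expand entirely in kernel evaluations. The following Python listing is a minimal sketch of that computation, not the paper's implementation; all names and defaults are invented for the example.

    import numpy as np

    def kernel_kmeans(K, n_clusters, n_iter=100, seed=0):
        """Lloyd-style kernel k-means on a precomputed kernel matrix K.

        The squared feature-space distance from phi(x_i) to the centroid
        m_c of cluster C_c expands in kernel values only:
            K[i,i] - (2/|C_c|) sum_{j in C_c} K[i,j]
                   + (1/|C_c|^2) sum_{j,l in C_c} K[j,l]
        """
        n = K.shape[0]
        rng = np.random.default_rng(seed)
        labels = rng.integers(n_clusters, size=n)  # random initial assignment
        for _ in range(n_iter):
            dist = np.full((n, n_clusters), np.inf)
            for c in range(n_clusters):
                mask = labels == c
                size = int(mask.sum())
                if size == 0:
                    continue  # empty cluster: leave its column at infinity
                dist[:, c] = (np.diag(K)
                              - 2.0 * K[:, mask].sum(axis=1) / size
                              + K[np.ix_(mask, mask)].sum() / size ** 2)
            new_labels = dist.argmin(axis=1)
            # k-means objective after the assignment step
            distortion = dist[np.arange(n), new_labels].sum()
            if np.array_equal(new_labels, labels):
                break  # assignments stable: local minimum reached
            labels = new_labels
        return labels, distortion

Each resulting centroid is a mean of mapped cluster members, so projecting onto it still costs one kernel evaluation per training point; the preimage reconstruction step described in the abstract replaces each centroid by a single approximate input-space point, reducing the expansion to one kernel evaluation per centre.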
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Marukatat, S. (2006). Sparse Kernel PCA by Kernel K-Means and Preimage Reconstruction Algorithms. In: Yang, Q., Webb, G. (eds.) PRICAI 2006: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol. 4099. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-36668-3_49
DOI: https://doi.org/10.1007/978-3-540-36668-3_49
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-36667-6
Online ISBN: 978-3-540-36668-3