
Sparse Kernel PCA by Kernel K-Means and Preimage Reconstruction Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4099)

Abstract

Kernel PCA, like other kernel-based techniques, suffers from high memory requirements and computational cost, as well as from a tedious training procedure. This work shows that the objective function of Kernel PCA, i.e., the reconstruction error, can be upper bounded by the distortion of the K-means algorithm in the feature space. From this relation, we propose to simplify Kernel PCA's training procedure using the kernel K-means algorithm. Applying a preimage reconstruction algorithm allows further simplification and leads to a more computationally economical solution.
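To make the pipeline described in the abstract concrete, here is a minimal Python sketch, not the paper's implementation: a kernel K-means that clusters data using only the Gram matrix, followed by the classical fixed-point preimage iteration for Gaussian kernels to map each feature-space centroid back to input space. The RBF kernel choice, random initialization, the toy data, and the function names (rbf_kernel, kernel_kmeans, rbf_preimage) are illustrative assumptions.

import numpy as np


def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))


def kernel_kmeans(K, n_clusters, n_iter=100, seed=0):
    """Cluster points given only their Gram matrix K, working entirely in feature space.

    The squared feature-space distance from phi(x_i) to the centroid of cluster C is
    K_ii - (2/|C|) sum_{j in C} K_ij + (1/|C|^2) sum_{j,l in C} K_jl.
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=n)
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.empty((n, n_clusters))
        for c in range(n_clusters):
            mask = labels == c
            if not mask.any():                      # re-seed an empty cluster
                mask[rng.integers(n)] = True
            dist[:, c] = (diag
                          - 2.0 * K[:, mask].mean(axis=1)
                          + K[np.ix_(mask, mask)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels


def rbf_preimage(X_cluster, gamma, n_iter=50):
    """Approximate input-space preimage of a feature-space cluster centroid,
    using the classical fixed-point iteration for Gaussian kernels."""
    z = X_cluster.mean(axis=0)                      # start from the input-space mean
    for _ in range(n_iter):
        w = np.exp(-gamma * np.sum((X_cluster - z) ** 2, axis=1))
        if w.sum() < 1e-12:
            break
        z = (w[:, None] * X_cluster).sum(axis=0) / w.sum()
    return z


# Toy usage: two Gaussian blobs, clustered purely through the kernel matrix,
# then each feature-space centroid is mapped back to an approximate preimage.
X = np.vstack([np.random.randn(50, 2) + [3.0, 3.0], np.random.randn(50, 2)])
gamma = 0.5
labels = kernel_kmeans(rbf_kernel(X, gamma), n_clusters=2)
centers = [rbf_preimage(X[labels == c], gamma) for c in range(2)]
print(centers)

Under this sketch, the recovered preimages could serve as a small set of expansion points for a sparse kernel PCA basis, which is the kind of computational economy the abstract refers to; the exact construction in the paper may differ.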





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Marukatat, S. (2006). Sparse Kernel PCA by Kernel K-Means and Preimage Reconstruction Algorithms. In: Yang, Q., Webb, G. (eds) PRICAI 2006: Trends in Artificial Intelligence. PRICAI 2006. Lecture Notes in Computer Science, vol 4099. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-36668-3_49


  • DOI: https://doi.org/10.1007/978-3-540-36668-3_49

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36667-6

  • Online ISBN: 978-3-540-36668-3

  • eBook Packages: Computer Science (R0)
