Kernel Construction via Generalized Eigenvector Decomposition

  • Conference paper
Foundations of Intelligent Systems

Part of the book series: Advances in Intelligent and Soft Computing (AINSC, volume 122)

Abstract

Kernel construction is a key issue in both current research on and application of kernel methods. In this paper, we present an effective kernel construction method that reduces the construction of a kernel function to the solution of generalized eigenvalue problems. Specifically, we first obtain a primal kernel function based on the similarity of instances and refine it with a conformal transformation. We then determine the parameters of the kernel function by simply solving generalized eigenvalue problems according to the kernel alignment and Fisher criteria. Our method avoids local maxima and ensures the positive semidefiniteness of the constructed kernel. Experimental results show that our kernel construction method is effective and robust.
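As a rough illustration of how kernel alignment leads to a generalized eigenvalue problem (a sketch of the standard alignment-maximization formulation, not the paper's specific construction; the RBF base kernels and widths below are hypothetical choices): writing the kernel as a weighted combination K = Σᵢ μᵢ Kᵢ, the squared alignment with the target y yᵀ is (μᵀv)² / (μᵀMμ · ⟨yyᵀ, yyᵀ⟩), with vᵢ = ⟨Kᵢ, yyᵀ⟩_F and Mᵢⱼ = ⟨Kᵢ, Kⱼ⟩_F. Maximizing it is the generalized eigenproblem (v vᵀ) μ = λ M μ, so the optimal weights are the top generalized eigenvector and no local maxima arise:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Toy labeled data: two Gaussian blobs with labels -1 / +1.
n = 40
X = np.vstack([rng.normal(-1, 1, (n // 2, 2)), rng.normal(1, 1, (n // 2, 2))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

def rbf(X, gamma):
    """RBF (Gaussian) kernel matrix exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

# Base kernels at several (hypothetical) widths.
kernels = [rbf(X, g) for g in (0.1, 1.0, 10.0)]
T = np.outer(y, y)  # ideal target kernel y y^T

# v_i = <K_i, T>_F  and  M_ij = <K_i, K_j>_F  (Frobenius inner products).
v = np.array([np.sum(K * T) for K in kernels])
M = np.array([[np.sum(Ki * Kj) for Kj in kernels] for Ki in kernels])

# Maximizing the squared alignment is the generalized eigenproblem
#   (v v^T) mu = lam * M mu;  eigh solves a x = lam b x with b = M.
lam, U = eigh(np.outer(v, v), M)
mu = U[:, -1]                 # eigenvector of the largest eigenvalue
if mu @ v < 0:
    mu = -mu                  # fix the sign ambiguity of the eigenvector

# Alignment of the combined kernel with the target.
K = sum(m * Ki for m, Ki in zip(mu, kernels))
alignment = np.sum(K * T) / np.sqrt(np.sum(K * K) * np.sum(T * T))
print("weights:", mu)
print("alignment:", alignment)
```

Because v vᵀ has rank one, the problem has a closed-form solution μ ∝ M⁻¹v, and the combined kernel aligns with the target at least as well as any single base kernel; note that unconstrained weights do not by themselves guarantee positive semidefiniteness, which the paper's construction addresses separately.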




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liu, Y., Liao, S. (2011). Kernel Construction via Generalized Eigenvector Decomposition. In: Wang, Y., Li, T. (eds) Foundations of Intelligent Systems. Advances in Intelligent and Soft Computing, vol 122. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25664-6_23

  • DOI: https://doi.org/10.1007/978-3-642-25664-6_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-25663-9

  • Online ISBN: 978-3-642-25664-6
