Abstract
Kernel construction is a key issue in both the research and application of kernel methods. In this paper, we present an effective kernel construction method that reduces the construction of a kernel function to the solution of generalized eigenvalue problems. Specifically, we first obtain a primal kernel function based on the similarity of instances and refine it with a conformal transformation. We then determine the parameters of the kernel function by solving generalized eigenvalue problems derived from the kernel alignment and Fisher criteria. Our method avoids local maxima and ensures the positive semidefiniteness of the constructed kernel. Experimental results show that our kernel construction method is effective and robust.
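To make the construction concrete, the sketch below shows one way a conformal kernel refinement with eigenvalue-based parameter selection could look. The Gaussian base kernel, the choice of `cores` as basis points for the conformal factor, and the particular between-/within-class matrices `M` and `N` are illustrative assumptions in the spirit of a Fisher-style separability criterion in the empirical feature space; they are not the paper's exact formulation.

```python
# Minimal sketch: conformal refinement of a base kernel, with the combination
# coefficients of the conformal factor chosen by a generalized eigenvalue
# problem. The base kernel, basis points, and criterion matrices below are
# assumptions for illustration, not the authors' exact construction.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def gaussian_kernel(A, B, gamma=1.0):
    """Base (primal) kernel: Gaussian similarity between rows of A and B."""
    return np.exp(-gamma * cdist(A, B, "sqeuclidean"))


def conformal_kernel(X, y, cores, gamma=1.0):
    """Refine the base kernel K via K_tilde = D K D with D = diag(c(x_i)),
    where c(x) = sum_j alpha_j * k(x, core_j). The coefficients alpha are the
    leading generalized eigenvector of (M, N), an assumed Fisher-style
    between-class / within-class criterion."""
    K = gaussian_kernel(X, X, gamma)       # n x n base kernel
    B = gaussian_kernel(X, cores, gamma)   # n x m basis evaluations for c(x)

    same = (y[:, None] == y[None, :]).astype(float)  # within-class indicator
    diff = 1.0 - same                                 # between-class indicator

    # Quadratic forms in alpha: reward between-class contrast, penalise
    # within-class spread; N is regularised so it stays positive definite.
    M = B.T @ (K * diff) @ B
    N = B.T @ (K * same) @ B + 1e-8 * np.eye(B.shape[1])

    # eigh returns eigenvalues in ascending order; take the leading eigenvector.
    _, eigvecs = eigh(M, N)
    alpha = eigvecs[:, -1]

    c = B @ alpha                              # conformal factor at each sample
    K_tilde = (c[:, None] * K) * c[None, :]    # D K D: symmetric and PSD
    return K_tilde, alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = (X[:, 0] > 0).astype(int)
    K_tilde, alpha = conformal_kernel(X, y, cores=X[::5])
    print(K_tilde.shape, alpha.shape)
```

Under these assumptions, the refined matrix stays symmetric positive semidefinite because it is a congruence transform D K D of the base kernel, which matches the abstract's claim that the construction preserves positive semidefiniteness.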
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Liu, Y., Liao, S. (2011). Kernel Construction via Generalized Eigenvector Decomposition. In: Wang, Y., Li, T. (eds) Foundations of Intelligent Systems. Advances in Intelligent and Soft Computing, vol 122. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25664-6_23
Print ISBN: 978-3-642-25663-9
Online ISBN: 978-3-642-25664-6