Abstract
We propose a kernel function estimation strategy to support machine learning tasks by analyzing the input samples using Rényi's information metrics. Specifically, we aim to identify a Reproducing Kernel Hilbert Space (RKHS) that spreads the information forces among data points as widely as possible, by maximizing the information potential variability of a Parzen-based pdf estimate. From this criterion, a Gaussian kernel bandwidth update rule is obtained as a function of the forces induced by a given dataset. Our proposal is tested on synthetic and real-world datasets for clustering and classification tasks. The results show that the presented approach computes RKHSs that favor data-group separability, attaining suitable learning performance in comparison with state-of-the-art algorithms.
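The selection criterion described above can be illustrated with a minimal sketch: for each candidate Gaussian bandwidth, evaluate the pairwise Parzen kernel values (the terms of Rényi's quadratic information potential) and keep the bandwidth that maximizes their variability. The function names `ip_variability` and `select_bandwidth` below are our own, and the grid search is an assumption for illustration; the paper derives an analytic, force-based update rule rather than this brute-force scan.

```python
import numpy as np

def ip_variability(X, sigma):
    """Variability of the information potential terms for bandwidth sigma.

    The information potential of a Parzen estimate with Gaussian kernels is
    the mean of all pairwise kernel evaluations; here we return the variance
    of the off-diagonal kernel values as a simple variability measure.
    """
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    np.maximum(d2, 0.0, out=d2)                     # guard against round-off
    K = np.exp(-d2 / (2.0 * sigma**2))              # Gaussian (Parzen) kernel matrix
    off_diagonal = K[~np.eye(X.shape[0], dtype=bool)]
    return off_diagonal.var()

def select_bandwidth(X, sigmas):
    """Pick the candidate bandwidth maximizing information potential variability."""
    scores = [ip_variability(X, s) for s in sigmas]
    return sigmas[int(np.argmax(scores))]
```

Intuitively, a very small bandwidth drives all off-diagonal kernel values toward zero and a very large one drives them toward one; in both extremes the variability collapses, so the maximizer sits at an intermediate scale that discriminates between within-group and between-group distances.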
© 2014 Springer International Publishing Switzerland
Álvarez-Meza, A.M., Cárdenas-Peña, D., Castellanos-Dominguez, G. (2014). Unsupervised Kernel Function Building Using Maximization of Information Potential Variability. In: Bayro-Corrochano, E., Hancock, E. (eds) Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications. CIARP 2014. Lecture Notes in Computer Science, vol 8827. Springer, Cham. https://doi.org/10.1007/978-3-319-12568-8_41
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12567-1
Online ISBN: 978-3-319-12568-8