Abstract
We propose a functional relevance learning scheme for learning vector quantization of functional data. The relevance profile is modeled as a superposition of basis functions and therefore depends on only a few parameters, in contrast to standard relevance learning, which adapts one relevance weight per input dimension. Sparsity of the superposition is enforced by an entropy-based penalty term.
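The idea of the abstract can be sketched in a few lines: instead of learning one relevance weight per input dimension, the relevance profile over the dimensions is parameterized as a superposition of a small set of basis functions, and its Shannon entropy serves as a sparsity penalty. The sketch below is illustrative only, assuming Gaussian basis functions and hypothetical parameter values; it is not the authors' exact formulation.

```python
import numpy as np

def relevance_profile(beta, centers, width, D):
    """Relevance profile over D input dimensions as a superposition of
    Gaussian basis functions (an illustrative basis choice)."""
    x = np.arange(D)
    # Basis matrix: K basis functions evaluated at each of the D dimensions.
    G = np.exp(-0.5 * ((x[None, :] - centers[:, None]) / width) ** 2)
    lam = beta @ G                 # superposition with K coefficients
    lam = np.clip(lam, 1e-12, None)
    return lam / lam.sum()         # normalize to a probability-like profile

def entropy_penalty(lam):
    """Shannon entropy of the profile; adding it to the cost function
    (to be minimized) pushes the superposition toward sparsity."""
    return -np.sum(lam * np.log(lam))

# Hypothetical example: D = 100 dimensions described by only K = 3
# coefficients plus basis locations, instead of 100 free relevances.
beta = np.array([0.7, 0.2, 0.1])
centers = np.array([20.0, 50.0, 80.0])
lam = relevance_profile(beta, centers, width=5.0, D=100)
```

In such a parameterization the number of adapted quantities is the number of basis coefficients (here 3) rather than the input dimensionality (here 100), which is the efficiency gain the abstract refers to.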
© 2011 Springer-Verlag Berlin Heidelberg
Villmann, T., Kästner, M. (2011). Sparse Functional Relevance Learning in Generalized Learning Vector Quantization. In: Laaksonen, J., Honkela, T. (eds) Advances in Self-Organizing Maps. WSOM 2011. Lecture Notes in Computer Science, vol 6731. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21566-7_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21565-0
Online ISBN: 978-3-642-21566-7