Sparse Functional Relevance Learning in Generalized Learning Vector Quantization

  • Conference paper
Advances in Self-Organizing Maps (WSOM 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6731)

Included in the following conference series: Workshop on Self-Organizing Maps (WSOM)

Abstract

We propose functional relevance learning for learning vector quantization of functional data. The relevance profile is modeled as a superposition of a set of basis functions and therefore depends on only a few parameters compared to standard relevance learning. Moreover, sparsity of the superposition is enforced by an entropy-based penalty term.
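
The abstract does not specify the functional form of the profile or of the penalty. A minimal sketch of the idea, assuming Gaussian basis functions and a Shannon-entropy penalty (all function and parameter names below are illustrative and not taken from the paper), could look as follows:

import numpy as np

def relevance_profile(t, centers, widths, weights):
    # Relevance profile as a superposition of Gaussian basis functions;
    # only the few parameters (centers, widths, weights) have to be adapted.
    # Weights are assumed nonnegative so that the profile is a valid distribution.
    basis = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / widths[None, :]) ** 2)
    lam = basis @ weights
    return lam / lam.sum()  # normalize so the profile sums to one

def entropy_penalty(lam, eps=1e-12):
    # Shannon entropy of the normalized profile; low entropy corresponds to a
    # sparse profile, so adding this term with a positive weight to the
    # classifier cost pushes the superposition toward sparsity.
    return -np.sum(lam * np.log(lam + eps))

In a full implementation such a penalty would typically be added, with a weighting factor, to the GLVQ cost function and minimized jointly with the prototypes and the basis-function parameters.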


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Villmann, T., Kästner, M. (2011). Sparse Functional Relevance Learning in Generalized Learning Vector Quantization. In: Laaksonen, J., Honkela, T. (eds) Advances in Self-Organizing Maps. WSOM 2011. Lecture Notes in Computer Science, vol 6731. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21566-7_8

  • DOI: https://doi.org/10.1007/978-3-642-21566-7_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21565-0

  • Online ISBN: 978-3-642-21566-7

  • eBook Packages: Computer Science, Computer Science (R0)
