Abstract
Neural Gas (NG) algorithms aim to find optimal data representations based on feature vectors. Unlike the Self-Organizing Map (SOM), NG algorithms take into account the dissimilarities between prototypes in the original input space rather than on a grid defined in advance, and NG has been successfully applied to vector quantization and clustering. However, conventional NG algorithms implicitly assume that all variables are equally important for the clustering task, whereas some variables may be irrelevant and, among the relevant ones, some may matter more than others. This paper proposes an adaptive batch NG algorithm that extends the traditional batch NG algorithm with an additional step that automatically computes the importance of each variable in the clustering task. Experiments with synthetic and real datasets show the usefulness of the proposed adaptive NG algorithm.
Acknowledgment
The authors would like to thank the anonymous referees for their careful review, and CNPq (National Council for Scientific and Technological Development, Brazil, grant 303187/2013-1) for its financial support.
© 2019 Springer Nature Switzerland AG
Cite this paper
Cavalcanti, N.L., Ferreira, M.R.P., Tenorio de Carvalho, F.d.A. (2019). Adaptive-\(L_2\) Batch Neural Gas. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning. ICANN 2019. Lecture Notes in Computer Science, vol. 11728. Springer, Cham. https://doi.org/10.1007/978-3-030-30484-3_7
Print ISBN: 978-3-030-30483-6
Online ISBN: 978-3-030-30484-3