Abstract
Discovering the most important variables is a crucial step in accelerating model building without losing the potential predictive power of the data. In many practical problems it is necessary to identify the dependent variables and those that are redundant. In this paper, an automatic method for discovering the most important signals or characteristics for building data-driven models is presented. The method was designed for very high-dimensional input spaces in which many variables are independent, while many others are combinations of the independent ones. It is based on the SOM neural network together with a feature-weighting method very similar to Linear Discriminant Analysis (LDA), with some modifications.
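To make the idea concrete, the following is a minimal sketch (not the authors' exact algorithm; the grid size, learning schedule, synthetic data, and scoring formula are all assumptions) of the two ingredients the abstract names: a self-organizing map trained on the inputs, followed by an LDA-like feature weight computed as the ratio of between-cluster to within-cluster variance across the SOM's best-matching units. Variables with high scores separate the map's clusters well; redundant combinations can then be inspected and pruned.

```python
# Hedged illustration: tiny online SOM + LDA-like feature scoring.
# Hypothetical names and parameters throughout; not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x0, x1 independent; x2 is a linear combination (redundant).
x = rng.normal(size=(300, 2))
data = np.column_stack([x, 0.7 * x[:, 0] - 0.3 * x[:, 1]])

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, sigma0=1.5):
    """Plain online SOM with a Gaussian neighbourhood and decaying rates."""
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    w = data[rng.choice(len(data), len(coords))]  # init weights from samples
    t, t_max = 0, epochs * len(data)
    for _ in range(epochs):
        for v in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - v) ** 2).sum(axis=1))  # best-matching unit
            frac = t / t_max
            lr = lr0 * (1 - frac)                 # learning rate decay
            sigma = sigma0 * (1 - frac) + 0.5     # neighbourhood shrinks
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))[:, None]
            w += lr * h * (v - w)                 # pull units toward sample
            t += 1
    return w

def feature_scores(data, w):
    """LDA-like weight per feature: between- vs within-cluster variance,
    where clusters are the samples mapped to each SOM unit."""
    bmus = np.array([np.argmin(((w - v) ** 2).sum(axis=1)) for v in data])
    mean = data.mean(axis=0)
    sb = np.zeros(data.shape[1])  # between-cluster scatter
    sw = np.zeros(data.shape[1])  # within-cluster scatter
    for u in np.unique(bmus):
        grp = data[bmus == u]
        sb += len(grp) * (grp.mean(axis=0) - mean) ** 2
        sw += ((grp - grp.mean(axis=0)) ** 2).sum(axis=0)
    return sb / (sw + 1e-12)

w = train_som(data)
scores = feature_scores(data, w)
print(scores)  # one relevance weight per input variable
```

In a real setting the scores would be compared against each other (or a threshold) to decide which variables to keep; the redundant combination would then be detected by its correlation with the retained independent variables rather than by the score alone.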
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Pavón, F., Vega, J., Canto, S.D. (2015). SOM and Feature Weights Based Method for Dimensionality Reduction in Large Gauss Linear Models. In: Gammerman, A., Vovk, V., Papadopoulos, H. (eds) Statistical Learning and Data Sciences. SLDS 2015. Lecture Notes in Computer Science(), vol 9047. Springer, Cham. https://doi.org/10.1007/978-3-319-17091-6_32
DOI: https://doi.org/10.1007/978-3-319-17091-6_32
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-17090-9
Online ISBN: 978-3-319-17091-6