Abstract
In recent years, databases in all areas of knowledge have grown considerably, both in the number of instances and in the number of attributes. Current data sets may contain hundreds of thousands of variables with a high degree of redundancy and/or irrelevancy, which can degrade the performance and scalability of many data mining algorithms. In this work we review the state of the art in embedded feature selection for the Support Vector Machine (SVM) classifier, and we present two additional approaches that address new challenges in this area: simultaneous feature and model selection, and highly imbalanced binary classification. We compare our approaches with other state-of-the-art algorithms to demonstrate their effectiveness and efficiency.
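The idea of embedded feature selection, as described above, is that variables are discarded as part of training the classifier itself rather than in a separate pre- or post-processing step. A minimal sketch of this idea, not the specific methods of this paper, is an l1-penalized linear SVM (here via scikit-learn's `LinearSVC`, an assumed toolchain): the l1 penalty drives the weights of redundant or irrelevant attributes to exactly zero during training, so the surviving nonzero weights identify the selected features.

```python
# Hedged illustration of embedded feature selection with an
# l1-penalized linear SVM; not the exact formulation of the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Synthetic data: only 5 of 50 attributes are informative; the rest
# are redundant or irrelevant, mimicking high-redundancy data sets.
X, y = make_classification(n_samples=200, n_features=50,
                           n_informative=5, n_redundant=10,
                           random_state=0)

# The l1 penalty makes feature selection part of training:
# weights of unhelpful features are shrunk to exactly zero.
clf = LinearSVC(C=0.1, penalty="l1", dual=False,
                max_iter=5000, random_state=0)
clf.fit(X, y)

selected = np.flatnonzero(clf.coef_.ravel())
print(f"kept {selected.size} of {X.shape[1]} features")
```

For the imbalanced-classification setting mentioned above, the same estimator can additionally be fitted with `class_weight="balanced"` so that errors on the minority class are penalized more heavily; the sparsity-inducing mechanism is unchanged.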
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Maldonado, S., Weber, R. (2011). Embedded Feature Selection for Support Vector Machines: State-of-the-Art and Future Challenges. In: San Martin, C., Kim, SW. (eds) Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications. CIARP 2011. Lecture Notes in Computer Science, vol 7042. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25085-9_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-25084-2
Online ISBN: 978-3-642-25085-9
eBook Packages: Computer Science (R0)