A Probabilistic Iterative Local Search Algorithm Applied to Full Model Selection
There is currently no low-runtime solution to the problem of choosing preprocessing methods, feature selection algorithms, and classifiers for a supervised learning problem. In this paper we present a method for efficiently finding a combination of algorithms and parameters that effectively describes a dataset. We also present an optimization technique, based on ParamILS, that can be applied in other contexts where each evaluation of the objective function is highly time-consuming but a cheap estimate of this function is available. We describe our algorithm and an initial validation of it on real and synthetic data. In this validation, our proposal achieves a significant reduction in runtime compared to ParamILS while solving problems with these characteristics.
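The core idea described above — iterated local search that spends the expensive objective evaluation only on candidates pre-screened by a cheap estimate — can be sketched as follows. This is a minimal illustration, not the authors' ParamILS variant; the function names (`neighbors`, `estimate`, `evaluate`, `perturb`) and the minimization setting are assumptions for the example.

```python
import random

def iterated_local_search(init, neighbors, estimate, evaluate,
                          perturb, iters=50, seed=0):
    """Iterated local search (minimization) that screens candidates
    with a cheap estimate and runs the expensive evaluation only on
    the most promising neighbor per iteration. Illustrative sketch."""
    rng = random.Random(seed)
    best, best_cost = init, evaluate(init)
    current = best
    for _ in range(iters):
        # Cheap screening: rank the neighborhood by the estimate only.
        cand = min(neighbors(current, rng), key=estimate)
        # Expensive evaluation: run once per iteration, on the winner.
        cost = evaluate(cand)
        if cost < best_cost:
            best, best_cost, current = cand, cost, cand
        else:
            # Restart from a perturbed incumbent to escape local optima.
            current = perturb(best, rng)
    return best, best_cost

# Toy usage: minimize (x - 3)^2 over integers with a biased estimate.
best, best_cost = iterated_local_search(
    init=10,
    neighbors=lambda x, rng: [x - 1, x + 1],
    estimate=lambda x: (x - 3) ** 2 + 0.5,   # cheap, slightly biased
    evaluate=lambda x: (x - 3) ** 2,         # "expensive" objective
    perturb=lambda x, rng: x + rng.randint(-2, 2),
)
```

In a full model selection setting, a candidate would be a (preprocessing, feature selection, classifier, hyperparameter) tuple, the estimate might be accuracy on a small data subsample, and the expensive evaluation a full cross-validation run.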
Keywords: Full Model Selection (FMS) · Machine Learning · Challenge · Iterated Local Search (ILS)