Abstract
Integration methods for ensemble learning can follow two different approaches: combination or selection. The combination approach (also called fusion) combines the predictions of the different models in the ensemble to obtain the final ensemble prediction. The selection approach selects one (or more) models from the ensemble according to their prediction performance on similar data from the validation set. Typically, similar data are selected using the k-nearest neighbors method with the Euclidean distance. In this paper we discuss other approaches to obtaining similar data for the regression problem. We show that using similarity measures based on the target values improves results. We also show that dynamically selecting several models for the prediction task increases prediction accuracy compared with selecting just one model.
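The baseline dynamic selection scheme described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact procedure: the function name, the use of mean absolute error as the local competence measure, and the representation of models as plain callables are all assumptions. It finds the k nearest validation examples to a query point (Euclidean distance), ranks the ensemble members by their error on that neighbourhood, and averages the predictions of the best `n_select` models.

```python
import numpy as np

def dynamic_selection_predict(models, X_val, y_val, x_query, k=5, n_select=1):
    """Dynamic selection for regression (illustrative sketch).

    models   : list of callables mapping an (n, d) array to (n,) predictions
    X_val    : (n, d) validation inputs
    y_val    : (n,) validation targets
    x_query  : (d,) query point
    """
    # k nearest validation points to the query, by Euclidean distance
    dists = np.linalg.norm(X_val - x_query, axis=1)
    nn = np.argsort(dists)[:k]

    # local competence: mean absolute error of each model on the neighbourhood
    errors = [np.mean(np.abs(m(X_val[nn]) - y_val[nn])) for m in models]

    # keep the n_select locally best models and average their predictions
    best = np.argsort(errors)[:n_select]
    return np.mean([models[i](x_query[None, :]) for i in best])
```

Setting `n_select > 1` corresponds to the multi-model dynamic selection variant the paper argues for; replacing the Euclidean input-space distance with a similarity measure over target values is the other variation the paper studies.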
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Mendes-Moreira, J., Jorge, A.M., Soares, C., de Sousa, J.F. (2009). Ensemble Learning: A Study on Different Variants of the Dynamic Selection Approach. In: Perner, P. (ed.) Machine Learning and Data Mining in Pattern Recognition. MLDM 2009. Lecture Notes in Computer Science, vol. 5632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03070-3_15
DOI: https://doi.org/10.1007/978-3-642-03070-3_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-03069-7
Online ISBN: 978-3-642-03070-3
eBook Packages: Computer Science (R0)