Ensemble Learning: A Study on Different Variants of the Dynamic Selection Approach

  • Conference paper
Machine Learning and Data Mining in Pattern Recognition (MLDM 2009)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5632)

Abstract

Integration methods for ensemble learning follow one of two approaches: combination or selection. The combination approach (also called fusion) combines the predictions of the individual models in the ensemble to obtain the final ensemble prediction. The selection approach selects one (or more) models from the ensemble according to the prediction performance of those models on similar data from the validation set. Typically, the similar data are found with the k-nearest neighbors method using the Euclidean distance. In this paper we discuss other approaches for obtaining similar data in the regression setting. We show that using similarity measures based on the target values improves results. We also show that dynamically selecting several models for the prediction task increases prediction accuracy compared with selecting just one model.
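The dynamic selection scheme the abstract describes can be sketched roughly as follows: find the k validation examples nearest to the query point, rank the ensemble's models by their error on that local neighbourhood, and average the predictions of the best few. This is a minimal illustration, not the paper's implementation; the function and parameter names are hypothetical, and models are assumed to be plain callables mapping a feature array to predictions.

```python
import numpy as np

def dynamic_selection_predict(models, X_val, y_val, x_query, k=5, n_select=2):
    """Predict for x_query by dynamically selecting the n_select models
    with the lowest squared error on the k validation points nearest
    to x_query (Euclidean distance, as in the baseline the paper discusses)."""
    # Locate the k validation examples closest to the query point.
    dists = np.linalg.norm(X_val - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    # Rank each model by its mean squared error on that local neighbourhood.
    errors = [np.mean((m(X_val[nearest]) - y_val[nearest]) ** 2) for m in models]
    chosen = np.argsort(errors)[:n_select]
    # Combine the selected models by averaging their predictions.
    return float(np.mean([models[i](x_query[None, :]) for i in chosen]))
```

Setting `n_select=1` gives classic dynamic selection of a single model; `n_select > 1` gives the dynamic ensemble selection variant that the paper reports as more accurate. Replacing the Euclidean distance over the inputs with a similarity measure over the target values would yield the other variant the abstract mentions.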


Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mendes-Moreira, J., Jorge, A.M., Soares, C., de Sousa, J.F. (2009). Ensemble Learning: A Study on Different Variants of the Dynamic Selection Approach. In: Perner, P. (ed.) Machine Learning and Data Mining in Pattern Recognition. MLDM 2009. Lecture Notes in Computer Science (LNAI), vol. 5632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03070-3_15

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-03070-3_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-03069-7

  • Online ISBN: 978-3-642-03070-3

  • eBook Packages: Computer Science (R0)
