An Empirical Study of Multilayer Perceptron Ensembles for Regression Tasks

  • Conference paper
Trends in Applied Intelligent Systems (IEA/AIE 2010)

Abstract

This work presents an experimental study of ensemble methods for regression, using Multilayer Perceptrons (MLP) as the base learner and 61 datasets. The ensemble methods considered are Randomization, Random Subspaces, Bagging, Iterated Bagging and AdaBoost.R2. Surprisingly, and in contradiction to previous studies, the best overall results are obtained with Bagging. A likely cause of this difference is the base learner: MLPs instead of regression or model trees. Diversity-error diagrams are used to analyze the behaviour of the ensemble methods. Compared to Bagging, the additional diversity obtained with the other methods does not compensate for the increase in the errors of the individual ensemble members.
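As a rough illustration of the configuration the abstract reports as strongest, the sketch below trains a bagged ensemble of MLP regressors. It is a hypothetical reconstruction in Python with scikit-learn, not the authors' original experimental setup; the synthetic dataset, network size and ensemble size are placeholder assumptions rather than the settings used in the paper.

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic regression data, standing in for the 61 benchmark datasets of the study.
X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)

# One ensemble member: a small MLP, with inputs standardised inside a pipeline.
mlp = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=1000, random_state=0),
)

# Bagging: every member is trained on a bootstrap resample of the training set
# and the ensemble prediction is the average of the members' predictions.
bagged_mlps = BaggingRegressor(mlp, n_estimators=10, random_state=0)

rmse = -cross_val_score(bagged_mlps, X, y, cv=5,
                        scoring="neg_root_mean_squared_error")
print(f"5-fold CV RMSE: {rmse.mean():.3f} +/- {rmse.std():.3f}")
```

Replacing BaggingRegressor with scikit-learn's AdaBoostRegressor, which is based on AdaBoost.R2, gives one of the competing ensemble methods considered in the study.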

This work was supported by the Project 2009/00204/001 of “Caja de Burgos” and University of Burgos and the Project TIN2008-03151 of the Spanish Ministry of Education and Science.

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pardo, C., Rodríguez, J.J., García-Osorio, C., Maudes, J. (2010). An Empirical Study of Multilayer Perceptron Ensembles for Regression Tasks. In: García-Pedrajas, N., Herrera, F., Fyfe, C., Benítez, J.M., Ali, M. (eds) Trends in Applied Intelligent Systems. IEA/AIE 2010. Lecture Notes in Computer Science, vol 6097. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13025-0_12

  • DOI: https://doi.org/10.1007/978-3-642-13025-0_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-13024-3

  • Online ISBN: 978-3-642-13025-0

  • eBook Packages: Computer Science, Computer Science (R0)
