Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3697)

Included in the conference series: International Conference on Artificial Neural Networks (ICANN)

Abstract

As shown in the bibliography, training an ensemble of networks is an interesting way to improve performance. However, there are several methods for constructing the ensemble. In this paper we present new results from a comparison of twenty different methods. We have trained ensembles of 3, 9, 20 and 40 networks in order to report results over a wide range of ensemble sizes. The results show that the improvement in performance obtained by using more than 9 networks in the ensemble depends on the method, but it is usually low. The best method for an ensemble of 3 networks is the one called “Decorrelated”, which adds a penalty term to the usual Backpropagation error function to decorrelate the outputs of the networks in the ensemble. For ensembles of 9 and 20 networks the best method is conservative boosting, and for 40 networks the best method is Cels.
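The “Decorrelated” method mentioned above modifies standard Backpropagation training by adding a penalty term that discourages a network's errors from aligning with those of other ensemble members. As a minimal, illustrative sketch only, and not the paper's implementation, the following Python snippet shows one common form of such a penalty: the product of the errors of the network being trained and of a previously trained member. The function name decorrelated_loss, the weight lam and the toy data are assumptions made here for illustration.

    import numpy as np

    def decorrelated_loss(y_true, y_new, y_prev, lam=0.5):
        """Squared error for the network being trained (y_new) plus a
        decorrelation penalty against a previously trained member (y_prev).
        The penalty grows when both networks err on the same side of the
        target, pushing the new network toward decorrelated errors.
        lam is an illustrative penalty weight, not a value from the paper."""
        mse = 0.5 * (y_true - y_new) ** 2
        penalty = lam * (y_true - y_prev) * (y_true - y_new)
        return float(np.mean(mse + penalty))

    # Toy usage: targets and outputs of two ensemble members on five patterns.
    y_true = np.array([1.0, 0.0, 1.0, 1.0, 0.0])
    y_prev = np.array([0.9, 0.2, 0.6, 0.8, 0.1])  # already-trained network
    y_new  = np.array([0.7, 0.1, 0.9, 0.6, 0.3])  # network currently being trained
    print(decorrelated_loss(y_true, y_new, y_prev))

Under this form of penalty, its gradient with respect to the new network's output is simply added to the usual Backpropagation delta, so training each additional member costs roughly the same as standard Backpropagation.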

This research was supported by the project MAPACI TIC2002-02273 of CICYT in Spain.

An erratum to this chapter can be found at http://dx.doi.org/10.1007/11550907_163.


References

  1. Tumer, K., Ghosh, J.: Error correlation and error reduction in ensemble classifiers. Connection Science 8(3,4), 385–404 (1996)

  2. Raviv, Y., Intrator, N.: Bootstrapping with Noise: An Effective Regularization Technique. Connection Science 8(3,4), 355–372 (1996)

  3. Drucker, H., Cortes, C., Jackel, D., et al.: Boosting and Other Ensemble Methods. Neural Computation 6, 1289–1301 (1994)

  4. Fernández-Redondo, M., Hernández-Espinosa, C., Torres-Sospedra, J.: Classification by Multilayer Feedforward ensembles. In: Yin, F.-L., Wang, J., Guo, C. (eds.) ISNN 2004. LNCS, vol. 3173, pp. 852–857. Springer, Heidelberg (2004)

  5. Verikas, A., Lipnickas, A., Malmqvist, K., Bacauskiene, M., Gelzinis, A.: Soft Combination of neural classifiers: A comparative study. Pattern Recognition Letters 20, 429–444 (1999)

  6. Oza, N.C.: Boosting with Averaged Weight Vectors. In: Windeatt, T., Roli, F. (eds.) MCS 2003. LNCS, vol. 2709, pp. 15–24. Springer, Heidelberg (2003)

  7. Kuncheva, L.I.: Error Bounds for Aggressive and Conservative Adaboost. In: Windeatt, T., Roli, F. (eds.) MCS 2003. LNCS, vol. 2709, pp. 25–34. Springer, Heidelberg (2003)

  8. Breiman, L.: Arcing Classifiers. Annals of Statistics 26(3), 801–849 (1998)

  9. Liu, Y., Yao, X., Higuchi, T.: Evolutionary Ensembles with Negative Correlation Learning. IEEE Trans. on Evolutionary Computation 4(4), 380–387 (2000)

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Torres-Sospedra, J., Hernández-Espinosa, C., Fernández-Redondo, M. (2005). New Results on Ensembles of Multilayer Feedforward. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds) Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005. Lecture Notes in Computer Science, vol 3697. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11550907_23

  • DOI: https://doi.org/10.1007/11550907_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28755-1

  • Online ISBN: 978-3-540-28756-8

  • eBook Packages: Computer Science, Computer Science (R0)
