Statistical Ensemble Method (SEM): A New Meta-machine Learning Approach Based on Statistical Techniques

  • Conference paper
Computational Intelligence and Bioinspired Systems (IWANN 2005)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3512)


Abstract

The goal of combining the outputs of multiple models is to form an improved meta-model with higher generalization capability than the best single model used in isolation. However, most popular ensemble methods specify neither the number of component models nor their complexity, even though both parameters strongly influence the generalization capability of the meta-model. In this paper we propose an ensemble method that generates a meta-model with optimal values for these parameters. The proposed method uses resampling techniques to generate multiple estimates of the generalization error, and multiple comparison procedures to select the models that will be combined to form the meta-model. Experimental results show the performance of the method on regression and classification tasks using artificial and real databases.
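The selection idea described in the abstract can be sketched in a few lines. The code below is an illustrative reconstruction, not the paper's implementation: the function names (`kfold_errors`, `select_models`), the choice of k-fold cross-validation as the resampling scheme, and the paired t statistic with its 5% critical value are all assumptions standing in for the paper's resampling and multiple-comparison procedures.

```python
import random
import statistics

def kfold_errors(models, X, y, k=5, seed=0):
    """errors[m][f] = mean squared error of candidate m on held-out fold f.
    Resampling (here: k-fold CV) yields multiple generalization-error
    estimates per candidate model, as the method requires."""
    rng = random.Random(seed)
    idx = list(range(len(X)))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = [[0.0] * k for _ in models]
    for f, test_idx in enumerate(folds):
        test_set = set(test_idx)
        Xtr = [X[i] for i in idx if i not in test_set]
        ytr = [y[i] for i in idx if i not in test_set]
        for m, make_model in enumerate(models):
            predict = make_model(Xtr, ytr)  # each candidate returns predict(x)
            errors[m][f] = statistics.mean(
                (predict(X[i]) - y[i]) ** 2 for i in test_idx)
    return errors

def select_models(errors, t_crit=2.776):
    """Keep every model whose fold-wise errors are not significantly worse
    than the best model's, via a paired t statistic (df = k - 1; t_crit is
    the two-sided 5% value for k = 5). This stands in for the paper's
    multiple comparison procedures."""
    means = [statistics.mean(e) for e in errors]
    best = min(range(len(errors)), key=means.__getitem__)
    kept = []
    for m in range(len(errors)):
        diffs = [a - b for a, b in zip(errors[m], errors[best])]
        sd = statistics.stdev(diffs)
        t = 0.0 if sd == 0 else statistics.mean(diffs) / (sd / len(diffs) ** 0.5)
        if t <= t_crit:
            kept.append(m)
    return kept

# Two toy candidates of different complexity (illustrative only).
def make_mean(Xtr, ytr):
    mu = statistics.mean(ytr)
    return lambda x: mu

def make_linear(Xtr, ytr):
    xbar, ybar = statistics.mean(Xtr), statistics.mean(ytr)
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(Xtr, ytr))
         / sum((xi - xbar) ** 2 for xi in Xtr))
    return lambda x, a=ybar - b * xbar, b=b: a + b * x

rng = random.Random(1)
X = [rng.uniform(0, 1) for _ in range(100)]
y = [2 * xi + rng.gauss(0, 0.1) for xi in X]
models = [make_mean, make_linear]
errors = kfold_errors(models, X, y)
kept = select_models(errors)

# The meta-model averages the surviving models, refit on all data.
fits = [models[m](X, y) for m in kept]
meta = lambda x: statistics.mean(f(x) for f in fits)
```

On this toy regression task the constant-mean model is significantly worse than the linear one, so only the linear model survives the comparison step; with several near-equivalent candidates, averaging the survivors is what yields the ensemble gain.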





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Escolano, A.Y., Riaño, P.G., Junquera, J.P., Vázquez, E.G. (2005). Statistical Ensemble Method (SEM): A New Meta-machine Learning Approach Based on Statistical Techniques. In: Cabestany, J., Prieto, A., Sandoval, F. (eds) Computational Intelligence and Bioinspired Systems. IWANN 2005. Lecture Notes in Computer Science, vol 3512. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494669_24

  • DOI: https://doi.org/10.1007/11494669_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26208-4

  • Online ISBN: 978-3-540-32106-4

  • eBook Packages: Computer Science (R0)
