Effect of Feature Selection on Bagging Classifiers Based on Kernel Density Estimators

  • Conference paper

Abstract

A combination of classification rules (classifiers) is known as an ensemble, and in general it is more accurate than the individual classifiers used to build it. One method for constructing an ensemble is Bagging, introduced by Breiman (1996). This method relies on resampling techniques to obtain a different training set for each of the classifiers. Previous work has shown that Bagging is very effective for unstable classifiers. In this paper we present results on the application of Bagging to classifiers in which the class-conditional density is estimated using kernel density estimators. The effect of feature selection on Bagging is also considered.
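The procedure described in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: it assumes a Gaussian product kernel with a single fixed bandwidth `h` (the paper may use a different kernel or bandwidth rule), builds each base classifier on a bootstrap resample of the training set, and combines them by majority vote, as in Breiman's Bagging.

```python
import numpy as np

def gaussian_kde_logdensity(train, x, h):
    # Kernel density estimate at point x from the rows of `train`,
    # using a product Gaussian kernel with bandwidth h (up to a
    # normalizing constant, which cancels in the class comparison).
    diffs = (train - x) / h
    kernel_vals = np.exp(-0.5 * np.sum(diffs ** 2, axis=1))
    return np.log(np.mean(kernel_vals) + 1e-300)

def kde_classify(X_train, y_train, x, h=0.5):
    # Single KDE-based classifier: assign x to the class maximizing
    # prior * estimated class-conditional density (in log space).
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        prior = len(Xc) / len(X_train)
        scores.append(np.log(prior) + gaussian_kde_logdensity(Xc, x, h))
    return classes[int(np.argmax(scores))]

def bagged_kde_classify(X, y, x, n_boot=25, h=0.5, seed=None):
    # Bagging: train one KDE classifier per bootstrap resample of
    # (X, y), then combine their predictions by majority vote.
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), size=len(X))  # sample with replacement
        votes.append(kde_classify(X[idx], y[idx], x, h))
    values, counts = np.unique(votes, return_counts=True)
    return values[int(np.argmax(counts))]
```

Feature selection would enter this sketch as a preprocessing step: each base classifier (or the whole ensemble) is trained on a subset of the columns of `X`, which matters for KDE-based classifiers because density estimation degrades quickly as the number of features grows.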


References

  • BAUER, E. and KOHAVI, R. (1999): An empirical comparison of voting classification algorithms: Bagging, Boosting and variants. Machine Learning, 36, 105–139.

  • BLAKE, C. and MERZ, C. (1998): UCI Repository of Machine Learning Databases. Department of Information and Computer Science, University of California, Irvine, USA.

  • BREIMAN, L. (1996): Bagging Predictors. Machine Learning, 24, 123–140.

  • BREIMAN, L. (1998): Arcing Classifiers. Annals of Statistics, 26, 801–849.

  • DIETTERICH, T.G. (2000): An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, Boosting, and randomization. Machine Learning, 40, 139–157.

  • FREUND, Y. and SCHAPIRE, R. (1996): Experiments with a new boosting algorithm. In: Machine Learning, Proceedings of the Thirteenth International Conference, Morgan Kaufmann, San Francisco, 148–156.

  • KOHAVI, R. and JOHN, G.H. (1997): Wrappers for feature subset selection. Artificial Intelligence, 97, 273–324.

  • MACLIN, R. and OPITZ, D. (1997): An empirical evaluation of Bagging and Boosting. Proceedings of the Fourteenth National Conference on Artificial Intelligence, AAAI/MIT Press.

  • MICHIE, D., SPIEGELHALTER, D.J. and TAYLOR, C.C. (1994): Machine Learning, Neural and Statistical Classification. Ellis Horwood, London.

  • QUINLAN, J.R. (1996): Bagging, Boosting and C4.5. Proceedings of the Thirteenth National Conference on Artificial Intelligence, AAAI/MIT Press, 725–730.

  • SILVERMAN, B.W. (1986): Density Estimation for Statistics and Data Analysis. Chapman and Hall, London.

  • TITTERINGTON, D.M. (1980): A comparative study of kernel-based density estimates for categorical data. Technometrics, 22, 259–268.


Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Acuña, E., Rojas, A., Coaquira, F. (2002). Effect of Feature Selection on Bagging Classifiers Based on Kernel Density Estimators. In: Jajuga, K., Sokołowski, A., Bock, HH. (eds) Classification, Clustering, and Data Analysis. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-56181-8_17

  • DOI: https://doi.org/10.1007/978-3-642-56181-8_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43691-1

  • Online ISBN: 978-3-642-56181-8
