
Gated Boosting: Efficient Classifier Boosting and Combining

  • Conference paper
KI 2012: Advances in Artificial Intelligence (KI 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7526)


Abstract

We study boosting with a gating mechanism, Gated Boosting, which performs resampling instead of the weighting mechanism used in AdaBoost. In our method, gating networks determine the distribution of samples used to train each consecutive base classifier, taking into account the predictions of the prior base classifiers. Using gating networks prevents training instances from being repeatedly included in the different subsets used to train the base classifiers, which is a key factor in achieving diversity. Furthermore, the gating networks also determine which classifiers' outputs are pooled to produce the final output. The performance of the proposed method is demonstrated and compared to AdaBoost on four benchmarks from the UCI repository and on the MNIST dataset.
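
The abstract describes the mechanism only at a high level. As a rough illustration of the resampling idea, the following minimal Python sketch trains each stage only on the samples that earlier stages got wrong (so no instance is duplicated across training subsets), and a simple confidence gate decides which stage's output becomes the final prediction. The routing rule, the 0.9 confidence threshold, and all function names are assumptions for illustration; the paper trains learned gating networks rather than this fixed rule.

# Sketch of gated resampling, not the authors' exact formulation.
# Assumption: a fixed rule (route misclassified samples onward, gate on
# predicted confidence) stands in for the paper's learned gating networks.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def fit_gated_ensemble(X, y, n_stages=3):
    """Train a cascade of base classifiers; each stage sees only the
    samples the previous stage misclassified (disjoint resampling)."""
    stages, idx = [], np.arange(len(X))
    for _ in range(n_stages):
        if len(idx) == 0:
            break
        clf = DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx])
        stages.append(clf)
        wrong = clf.predict(X[idx]) != y[idx]
        idx = idx[wrong]  # gate: pass only misclassified samples onward
    return stages

def predict_gated(stages, X):
    """Route each sample through the cascade; a toy confidence gate
    picks whose output is used as the final prediction."""
    out = np.empty(len(X), dtype=int)
    decided = np.zeros(len(X), dtype=bool)
    for k, clf in enumerate(stages):
        proba = clf.predict_proba(X)
        conf = proba.max(axis=1)
        # accept confident predictions; the last stage decides the rest
        take = (~decided) & ((conf >= 0.9) | (k == len(stages) - 1))
        out[take] = clf.classes_[proba[take].argmax(axis=1)]
        decided |= take
    return out

X, y = make_classification(n_samples=600, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
stages = fit_gated_ensemble(Xtr, ytr)
print("accuracy:", (predict_gated(stages, Xte) == yte).mean())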




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yousefi, M.R., Breuel, T.M. (2012). Gated Boosting: Efficient Classifier Boosting and Combining. In: Glimm, B., Krüger, A. (eds) KI 2012: Advances in Artificial Intelligence. KI 2012. Lecture Notes in Computer Science, vol 7526. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33347-7_28


  • DOI: https://doi.org/10.1007/978-3-642-33347-7_28

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33346-0

  • Online ISBN: 978-3-642-33347-7

  • eBook Packages: Computer Science (R0)
