Boosting

Definition

Boosting is a kind of ensemble method [13] that produces a strong learner capable of making very accurate predictions by combining rough and moderately inaccurate learners (called base learners or weak learners). Specifically, boosting sequentially trains a series of base learners with a base learning algorithm, where the training examples wrongly predicted by one base learner receive more attention from the next base learner. It then produces the final strong learner as a weighted combination of these base learners.
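
The reweighting-and-combination scheme described above can be made concrete with a short sketch. The following Python snippet is not part of the original entry; it is a minimal AdaBoost-style illustration, assuming scikit-learn is available, using depth-1 decision trees as the base learners and labels in {-1, +1}. The names fit_boosting and predict_boosting are purely illustrative.

# Minimal AdaBoost-style sketch (illustrative only; assumes scikit-learn is installed).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_boosting(X, y, n_rounds=50):
    """Sequentially train base learners on labels y in {-1, +1}."""
    y = np.asarray(y)
    n = len(y)
    weights = np.full(n, 1.0 / n)        # start with uniform example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)   # a rough, weak base learner
        stump.fit(X, y, sample_weight=weights)
        pred = stump.predict(X)
        err = weights[pred != y].sum()   # weighted training error
        if err >= 0.5:                   # no better than random guessing: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
        # Wrongly predicted examples get larger weights, so the next
        # base learner pays more attention to them.
        weights *= np.exp(-alpha * y * pred)
        weights /= weights.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict_boosting(learners, alphas, X):
    """Weighted combination of the base learners' predictions."""
    score = sum(a * h.predict(X) for a, h in zip(learners, alphas))
    return np.where(score >= 0, 1, -1)

In each round the weights of wrongly predicted examples grow, so the next base learner concentrates on them; the final prediction is the sign of the weighted vote, matching the combination step described above.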

Historical Background

In 1989, Kearns and Valiant posed an interesting theoretical question: whether two complexity classes, the weakly learnable and the strongly learnable problems, are equal. In other words, can a weak learning algorithm that performs just slightly better than random guessing be boosted into an arbitrarily accurate strong learning algorithm? In 1990, Schapire [9] proved that the answer to the...

Recommended Reading

  1. Bauer E, Kohavi R. An empirical comparison of voting classification algorithms: bagging, boosting, and variants. Mach Learn. 1999;36(1–2):105–39.

  2. Breiman L. Prediction games and arcing algorithms. Neural Comput. 1999;11(7):1493–517.

  3. Freund Y, Schapire RE. A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci. 1997;55(1):119–39 (A short version appeared in the Proceedings of EuroCOLT’95).

  4. Friedman J, Hastie T, Tibshirani R. Additive logistic regression: a statistical view of boosting (with discussion). Ann Stat. 2000;28(2):337–407.

  5. Gao W, Zhou Z-H. On the doubt about margin explanation of boosting. Artif Intell. 2013;203:1–18.

  6. Meir R, Rätsch G. An introduction to boosting and leveraging. In: Mendelson S, Smola AJ, editors. Advanced lectures on machine learning. LNCS vol. 2600. Berlin: Springer; 2003. p. 118–83.

  7. Opitz D, Maclin R. Popular ensemble methods: an empirical study. J Artif Intell Res. 1999;11(1):169–98.

  8. Reyzin L, Schapire RE. How boosting the margin can also boost classifier complexity. In: Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh; 2006. p. 753–60.

  9. Schapire RE. The strength of weak learnability. Mach Learn. 1990;5(2):197–227.

  10. Schapire RE, Freund Y, Bartlett P, Lee WS. Boosting the margin: a new explanation for the effectiveness of voting methods. Ann Stat. 1998;26(5):1651–86.

  11. Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition; 2001. p. 511–8.

  12. Wang L, Sugiyama M, Yang C, Zhou Z-H, Feng J. On the margin explanation of boosting algorithms. In: Proceedings of the 21st Annual Conference on Learning Theory; 2008. p. 479–90.

  13. Zhou Z-H. Ensemble methods: foundations and algorithms. Boca Raton: CRC Press; 2012.

  14. Zhou Z-H. Large margin distribution learning. In: Proceedings of Artificial Neural Networks in Pattern Recognition; 2014.

Author information

Correspondence to Zhi-Hua Zhou.

Copyright information

© 2018 Springer Science+Business Media, LLC, part of Springer Nature

Cite this entry

Zhou, ZH. (2018). Boosting. In: Liu, L., Özsu, M.T. (eds) Encyclopedia of Database Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8265-9_568
