
Boosting with Averaged Weight Vectors

Conference paper

Multiple Classifier Systems (MCS 2003)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2709)

Abstract

AdaBoost [5] is a well-known ensemble learning algorithm that constructs its constituent, or base, models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence [7]. The idea is to make the next base model’s errors uncorrelated with those of the previous model. Researchers have observed that it would likely be better to construct a distribution orthogonal to the mistake vectors of all the previous base models, but that this is not always possible [7]. We present an algorithm that attempts to come as close as possible to this goal in an efficient manner. We present experimental results demonstrating significant improvement over AdaBoost and the Totally Corrective boosting algorithm [7], which also attempts to satisfy this goal.
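To make the reweighting step concrete, the following is a minimal Python sketch (not taken from the paper) of the standard AdaBoost update described above, assuming binary base models and a boolean mistake vector. After the update, the previous base model’s weighted error under the new distribution is exactly 1/2; this is the sense in which the new distribution is orthogonal to that model’s mistake vector [7]. The averaged_distribution helper is a hypothetical illustration of the averaged-weight-vector idea suggested by the title, not the paper’s exact algorithm.

    import numpy as np

    def adaboost_reweight(d, mistakes):
        """One AdaBoost reweighting step (Freund & Schapire [5]).

        d        -- current distribution over the m training examples (sums to 1)
        mistakes -- boolean vector, True where the latest base model erred

        Returns a new distribution under which the latest base model has
        weighted error exactly 1/2, i.e. one that is "orthogonal" to that
        model's mistake vector in the sense of [7].
        """
        eps = d[mistakes].sum()          # weighted error of the latest model
        assert 0.0 < eps < 0.5           # weak-learning assumption
        beta = eps / (1.0 - eps)
        d_new = d.copy()
        d_new[~mistakes] *= beta         # shrink weight on correct examples
        return d_new / d_new.sum()       # renormalize

    def averaged_distribution(history):
        """Hypothetical averaged-weight-vector step (illustration only):
        train the next base model on the average of all distributions
        produced so far, rather than on the latest one alone."""
        d_avg = np.mean(history, axis=0)
        return d_avg / d_avg.sum()

    # Toy usage: 5 examples, uniform start, model errs on examples 0 and 3.
    d = np.full(5, 0.2)
    mistakes = np.array([True, False, False, True, False])
    d_next = adaboost_reweight(d, mistakes)
    print(d_next[mistakes].sum())        # ~0.5: old model's error under d_next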


References

  1. E. Bauer and R. Kohavi. An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36:105–139, September 1999.

  2. C. Blake, E. Keogh, and C. J. Merz. UCI repository of machine learning databases, 1999. http://www.ics.uci.edu/~mlearn/MLRepository.html

  3. Y. Censor and A. Lent. An iterative row-action method for interval convex programming. Journal of Optimization Theory and Applications, 34(3):321–353, 1981.

  4. T. G. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40:139–158, August 2000.

  5. Y. Freund and R. Schapire. Experiments with a new boosting algorithm. In Proceedings of the Thirteenth International Conference on Machine Learning, pages 148–156, Bari, Italy, 1996. Morgan Kaufmann.

  6. M. J. Kearns and U. V. Vazirani. An Introduction to Computational Learning Theory. MIT Press, Cambridge, MA, 1994.

  7. J. Kivinen and M. K. Warmuth. Boosting as entropy projection. In Proceedings of the Twelfth Annual Conference on Computational Learning Theory, pages 134–144, 1999.

  8. A. Krogh and J. Vedelsby. Neural network ensembles, cross validation and active learning. In G. Tesauro, D. S. Touretzky, and T. K. Leen, editors, Advances in Neural Information Processing Systems 7, pages 231–238. MIT Press, 1995.

  9. S. Kutin and P. Niyogi. The interaction of stability and weakness in AdaBoost. Technical Report TR-2001-30, University of Chicago, October 2001.

  10. N. C. Oza. Online Ensemble Learning. PhD thesis, University of California, Berkeley, CA, December 2001.

  11. K. Tumer and J. Ghosh. Analysis of decision boundaries in linearly combined neural classifiers. Pattern Recognition, 29(2):341–348, February 1996.

Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Oza, N.C. (2003). Boosting with Averaged Weight Vectors. In: Windeatt, T., Roli, F. (eds) Multiple Classifier Systems. MCS 2003. Lecture Notes in Computer Science, vol 2709. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44938-8_2

  • DOI: https://doi.org/10.1007/3-540-44938-8_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40369-2

  • Online ISBN: 978-3-540-44938-6

  • eBook Packages: Springer Book Archive
