Abstract
Boosting is a class of machine learning methods based on the idea that a combination of simple classifiers (obtained by a weak learner) can perform better than any of those simple classifiers alone. A weak learner (WL) is a learning algorithm capable of producing classifiers whose probability of error is strictly (but only slightly) lower than that of random guessing (0.5, in the binary case). A strong learner (SL), in contrast, is able (given enough training data) to produce classifiers with arbitrarily small error probability.
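The weak-to-strong idea in the abstract can be made concrete with a minimal sketch of discrete (binary) AdaBoost using decision stumps as the weak learner. Each stump only needs weighted error below 0.5; reweighting the data and taking a weighted vote drives the training error of the combination down. The dataset, function names, and round count below are illustrative assumptions, not taken from the chapter.

```python
# Minimal sketch of discrete AdaBoost with decision-stump weak learners.
import numpy as np

def train_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump with the lowest
    weighted error on labels y in {-1, +1}."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = pol * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    return best, best_err

def stump_predict(stump, X):
    j, thr, pol = stump
    return pol * np.where(X[:, j] <= thr, 1, -1)

def adaboost(X, y, n_rounds):
    n = len(y)
    w = np.full(n, 1.0 / n)            # uniform initial weights
    ensemble = []                      # list of (alpha, stump) pairs
    for _ in range(n_rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)          # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred) # up-weight the stump's mistakes
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)

# Toy 1-D data: the positive class sits in a middle interval, so no single
# threshold separates it, but three boosted stumps classify it perfectly.
X = np.array([[1.], [2.], [3.], [4.], [5.], [6.]])
y = np.array([-1, -1, 1, 1, -1, -1])
model = adaboost(X, y, n_rounds=3)
print((predict(model, X) == y).mean())  # → 1.0
```

Note that every individual stump here has training error of at least 1/3, yet the weighted vote of three of them is exact — the essence of turning a weak learner into a strong one.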
Notes
- 1. We refer to a classifier learned by a WL as a weak classifier.
- 2. A given instance can be classified into one or more classes.
- 11. As of version PRTools 4.0, available at the time of this writing (July 2011).
Copyright information
© 2012 Springer Science+Business Media, LLC
About this chapter
Cite this chapter
Ferreira, A.J., Figueiredo, M.A.T. (2012). Boosting Algorithms: A Review of Methods, Theory, and Applications. In: Zhang, C., Ma, Y. (eds) Ensemble Machine Learning. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-9326-7_2
Print ISBN: 978-1-4419-9325-0
Online ISBN: 978-1-4419-9326-7