Cost-Sensitive Boosting: Fitting an Additive Asymmetric Logistic Regression Model

  • Conference paper
Advances in Machine Learning (ACML 2009)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5828)

Abstract

Conventional machine learning algorithms such as boosting treat all misclassification errors equally, which is inadequate for cost-sensitive classification problems such as object detection. Many cost-sensitive extensions of boosting that directly modify the weighting strategy of the original algorithms have been proposed, but they are heuristic in nature, supported only by empirical results, and lack sound theoretical analysis. This paper develops a framework, motivated by a statistical insight, that can embody almost all existing cost-sensitive boosting algorithms: fitting an additive asymmetric logistic regression model by stage-wise optimization of certain criteria. Four cost-sensitive boosting algorithms are derived, namely CSDA, CSRA, CSGA and CSLB, corresponding respectively to Discrete AdaBoost, Real AdaBoost, Gentle AdaBoost and LogitBoost. Experimental results on face detection demonstrate the effectiveness of the proposed learning framework in reducing the cumulative misclassification cost.
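As a rough illustration of the statistical framing, in the spirit of the additive-logistic-regression view of boosting (a sketch only; the symbols c_1, c_2, J, and F* below are illustrative choices, not the paper's notation), suppose misclassifying a positive example costs c_1 and misclassifying a negative example costs c_2. One common asymmetric exponential criterion and its pointwise minimizer are

\[
J(F) \;=\; \mathbb{E}\!\left[\,\mathbb{1}\{y=+1\}\,e^{-c_{1}F(x)} \;+\; \mathbb{1}\{y=-1\}\,e^{\,c_{2}F(x)}\,\right],
\qquad
F^{*}(x) \;=\; \frac{1}{c_{1}+c_{2}}\,\ln\frac{c_{1}\,P(y=+1\mid x)}{c_{2}\,P(y=-1\mid x)} .
\]

Setting c_1 = c_2 = 1 recovers the familiar half log-odds minimized by standard AdaBoost; with unequal costs, F* is an asymmetric (cost-weighted) logit, and stage-wise greedy minimization of such a criterion under different local approximations is what gives rise to Discrete-, Real-, Gentle- and Logit-style cost-sensitive updates.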

This work was partially supported by the Open Fund of the Province Level Key Laboratory for Colleges and Universities in Jiangsu Province under Grant No. KJS0802.

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, Qj., Mao, Yb., Wang, Zq., Xiang, Wb. (2009). Cost-Sensitive Boosting: Fitting an Additive Asymmetric Logistic Regression Model. In: Zhou, ZH., Washio, T. (eds.) Advances in Machine Learning. ACML 2009. Lecture Notes in Computer Science, vol. 5828. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05224-8_19

  • DOI: https://doi.org/10.1007/978-3-642-05224-8_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-05223-1

  • Online ISBN: 978-3-642-05224-8

  • eBook Packages: Computer Science (R0)
