Abstract
Conventional machine learning algorithms such as boosting treat all misclassification errors equally, which is inadequate for cost-sensitive classification problems such as object detection. Although many cost-sensitive extensions of boosting have been proposed that directly modify the weighting strategy of the original algorithms, they are heuristic in nature: their effectiveness is supported only by empirical results and lacks sound theoretical analysis. This paper develops a framework, based on a statistical insight, that embodies almost all existing cost-sensitive boosting algorithms: fitting an additive asymmetric logistic regression model by stage-wise optimization of certain criteria. Four cost-sensitive boosting algorithms are derived, namely CSDA, CSRA, CSGA and CSLB, corresponding to Discrete AdaBoost, Real AdaBoost, Gentle AdaBoost and LogitBoost, respectively. Experimental results on face detection demonstrate the effectiveness of the proposed learning framework in reducing the cumulative misclassification cost.
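To make the idea of modifying a boosting algorithm's weighting strategy concrete, the sketch below shows one common cost-sensitive variant of Discrete AdaBoost with decision stumps, in which the initial example weights are made proportional to per-example misclassification costs. This is a generic illustration under that assumption, not the paper's CSDA algorithm; the function names (`cs_discrete_adaboost`, `best_stump`, `stump_predict`) are hypothetical.

```python
import numpy as np

def best_stump(X, y, w):
    """Exhaustively find the decision stump minimizing weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):                 # each feature
        for thr in np.unique(X[:, j]):          # each candidate threshold
            for pol in (1, -1):                 # each polarity
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    return best

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def cs_discrete_adaboost(X, y, costs, n_rounds=10):
    """Cost-sensitive Discrete AdaBoost sketch (labels y in {-1, +1}).

    costs[i] is the misclassification cost of example i. Initializing
    the weight distribution proportionally to the costs is one simple
    cost-sensitive modification of the standard weighting strategy
    (assumed here for illustration).
    """
    w = costs / costs.sum()                     # cost-weighted initial weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = best_stump(X, y, w)
        pred = stump_predict(stump, X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # confidence of this round
        w *= np.exp(-alpha * y * pred)          # up-weight mistakes
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)

    def classify(Xq):
        agg = sum(a * stump_predict(s, Xq) for s, a in zip(stumps, alphas))
        return np.sign(agg)
    return classify
```

Because the cost vector skews the weight distribution toward expensive examples, the learned stumps preferentially reduce errors on those examples, which is the qualitative behavior the cost-sensitive variants in the paper aim for.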
This work was partially supported by the Open Fund of the Province-Level Key Laboratory for Colleges and Universities in Jiangsu Province under Grant No. KJS0802.
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Li, Qj., Mao, Yb., Wang, Zq., Xiang, Wb. (2009). Cost-Sensitive Boosting: Fitting an Additive Asymmetric Logistic Regression Model. In: Zhou, ZH., Washio, T. (eds) Advances in Machine Learning. ACML 2009. Lecture Notes in Computer Science, vol. 5828. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05224-8_19
DOI: https://doi.org/10.1007/978-3-642-05224-8_19
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-05223-1
Online ISBN: 978-3-642-05224-8