Enhancing the Performance of AdaBoost Algorithms by Introducing a Frequency Counting Factor for Weight Distribution Updating
This work presents a modified Boosting algorithm designed to avoid overfitting the training samples during training. The proposed algorithm updates the weight distribution according to the number of misclassified samples at each training iteration. Experimental tests show that our approach outperforms several classical AdaBoost algorithms in terms of generalization error, resistance to overfitting, and overall classification accuracy.
Keywords: AdaBoost algorithm, weight update, frequency factor, misclassified samples, machine learning
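To make the idea concrete, the following is a minimal sketch of an AdaBoost variant whose weight update is damped by a per-round frequency factor. The exact form of the factor is an assumption on our part: the abstract only states that weights are updated according to the number of misclassified samples per iteration, so here we take `f_t = (#misclassified) / N` and scale the exponential update by it. All function names and the decision-stump base learner are illustrative, not the authors' implementation.

```python
import numpy as np

def stump_predict(X, threshold, polarity):
    # Decision stump on a single 1-D feature.
    return np.where(polarity * X < polarity * threshold, 1, -1)

def train_stump(X, y, w):
    # Exhaustively pick the threshold/polarity with minimum weighted error.
    best = (None, None, np.inf)
    for threshold in np.unique(X):
        for polarity in (1, -1):
            pred = stump_predict(X, threshold, polarity)
            err = np.sum(w[pred != y])
            if err < best[2]:
                best = (threshold, polarity, err)
    return best

def frequency_adaboost(X, y, n_rounds=10):
    # AdaBoost variant with a frequency counting factor (assumed form):
    # f_t = (#misclassified at round t) / N scales the weight update,
    # so rounds with few mistakes reweight the distribution more gently,
    # limiting over-concentration on hard (possibly noisy) samples.
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(n_rounds):
        threshold, polarity, err = train_stump(X, y, w)
        err = max(err, 1e-10)                    # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)    # standard AdaBoost weight
        pred = stump_predict(X, threshold, polarity)
        f_t = np.mean(pred != y)                 # frequency counting factor
        w *= np.exp(-alpha * f_t * y * pred)     # damped update (assumption)
        w /= w.sum()
        stumps.append((threshold, polarity))
        alphas.append(alpha)
    def predict(Xq):
        score = sum(a * stump_predict(Xq, t, p)
                    for a, (t, p) in zip(alphas, stumps))
        return np.sign(score)
    return predict

# Toy linearly separable data: class +1 below 0.5, class -1 above.
X = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9])
y = np.array([1, 1, 1, 1, -1, -1, -1, -1])
clf = frequency_adaboost(X, y, n_rounds=5)
print((clf(X) == y).mean())  # prints 1.0 on this separable toy set
```

Note the design contrast with classical AdaBoost, where the update is `w *= exp(-alpha * y * pred)` regardless of how many samples were misclassified; the factor `f_t` here shrinks the step when mistakes are rare.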