Enhancing the Performance of AdaBoost Algorithms by Introducing a Frequency Counting Factor for Weight Distribution Updating

  • Diego Alonso Fernández Merjildo
  • Lee Luan Ling
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7441)

Abstract

This work presents a modified boosting algorithm that avoids overfitting the training samples during training. The proposed algorithm updates the weight distribution according to the number of misclassified samples at each training iteration. Experimental tests show that our approach offers several advantages over classical AdaBoost algorithms in terms of generalization capacity, overfitting avoidance, and classification performance.
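The abstract describes the update rule only at a high level, so a minimal sketch may help fix the idea. The Python snippet below is a hypothetical rendering, assuming the frequency counting factor is each sample's accumulated misclassification frequency and that it tempers the standard exponential AdaBoost update; the function name `adaboost_freq` and the damping form `(1 - f)` are illustrative assumptions, not the paper's exact formula.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_freq(X, y, T=50):
    """AdaBoost sketch with a hypothetical frequency counting factor.

    X: (n_samples, n_features) array; y: labels in {-1, +1}.
    Returns a predict(Xq) function for the boosted ensemble.
    """
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)      # weight distribution D_t over samples
    miss_count = np.zeros(n)     # times each sample has been misclassified
    learners, alphas = [], []
    for t in range(1, T + 1):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        miss = pred != y
        eps = w[miss].sum()                      # weighted training error
        if eps <= 0.0 or eps >= 0.5:             # weak-learner condition
            break
        alpha = 0.5 * np.log((1.0 - eps) / eps)  # usual AdaBoost step size
        miss_count += miss
        # Hypothetical frequency factor: f_i = (misclassifications of i) / t.
        # Damping the exponent by (1 - f) moderates the weight growth of
        # chronically hard (possibly noisy) samples -- one plausible reading
        # of the abstract's overfitting-avoidance claim.
        f = miss_count / t
        w *= np.exp(-alpha * y * pred * (1.0 - f))
        w /= w.sum()                             # renormalize distribution
        learners.append(stump)
        alphas.append(alpha)

    def predict(Xq):
        votes = sum(a * h.predict(Xq) for a, h in zip(alphas, learners))
        return np.sign(votes)

    return predict
```

Under this reading, samples that are misclassified round after round see their weights grow more slowly than in plain AdaBoost, so outliers cannot monopolize the distribution; this is the same intuition behind soft-margin boosting and Modest AdaBoost.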

Keywords

AdaBoost Algorithm · Weights Update · Frequency Factor · Misclassified Samples · Machine Learning


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Diego Alonso Fernández Merjildo (1)
  • Lee Luan Ling (1)

  1. Department of Communications (DECOM), School of Electrical and Computer Engineering (FEEC), State University of Campinas (UNICAMP), Campinas, Brazil
