Robust Asymmetric Adaboost

  • Pablo Ormeño
  • Felipe Ramírez
  • Carlos Valle
  • Héctor Allende-Cid
  • Héctor Allende
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7441)

Abstract

In real-world pattern recognition problems, such as computer-assisted medical diagnosis, examples of the phenomenon of interest are usually in the minority, making it necessary to build algorithms that emphasize one of the classes at training time. In this paper we propose a variation of the well-known AdaBoost algorithm that improves its performance by using an asymmetric and robust cost function. We assess the performance of the proposed method on two medical datasets and on synthetic datasets with different levels of imbalance, and compare our results against three state-of-the-art ensemble learning approaches, achieving better or comparable results.
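
The abstract does not reproduce the paper's cost function, so the sketch below only illustrates the general idea it describes: an AdaBoost-style weight update in which errors on the minority class carry a higher cost, and per-example weights are capped so outliers cannot dominate the training distribution. The weak learner, the cost ratio c_pos, the cap w_max, and all function names are illustrative assumptions, not the authors' formulation.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def asymmetric_robust_adaboost(X, y, n_rounds=50, c_pos=2.0, w_max=10.0):
    """Boosting sketch for labels y in {-1, +1}, with +1 the minority class.

    c_pos : assumed cost ratio; errors on positives count c_pos times more.
    w_max : assumed cap (in multiples of the uniform weight 1/n) limiting
            how much mass any single, possibly noisy, example can accumulate.
    """
    y = np.asarray(y)
    n = len(y)
    cost = np.where(y == 1, c_pos, 1.0)

    # Asymmetric initialisation: minority examples start with more mass.
    w = cost / cost.sum()

    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)

        # Cost-weighted training error of this round's stump.
        err = np.sum(w * cost * (pred != y)) / np.sum(w * cost)
        err = np.clip(err, 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)

        # Exponential update scaled by the asymmetric cost, then clipped
        # so atypical or mislabeled points cannot dominate the distribution.
        w = w * np.exp(-alpha * y * pred * cost)
        w = np.minimum(w, w_max / n)
        w = w / w.sum()

        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def ensemble_predict(stumps, alphas, X):
    # Weighted majority vote of the boosted stumps.
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.where(votes >= 0, 1, -1)

Under these assumptions, raising c_pos trades overall accuracy for recall on the minority class, while the weight cap keeps noisy examples from accumulating unbounded influence across rounds.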

Keywords

Ensemble learning · AdaBoost · Asymmetric cost functions · Robust methods

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Pablo Ormeño¹
  • Felipe Ramírez¹
  • Carlos Valle¹
  • Héctor Allende-Cid¹
  • Héctor Allende¹ ²
  1. Departamento de Informática, Universidad Técnica Federico Santa María, Valparaíso, Chile
  2. Facultad de Ingeniería y Ciencia, Universidad Adolfo Ibáñez, Viña del Mar, Chile