Robust Asymmetric Adaboost
In real-world pattern recognition problems, such as computer-assisted medical diagnosis, events of a given phenomenon are usually in the minority, making it necessary to build algorithms that emphasize the effect of one of the classes at training time. In this paper we propose a variation of the well-known AdaBoost algorithm that improves its performance by using an asymmetric and robust cost function. We assess the performance of the proposed method on two medical datasets and on synthetic datasets with different levels of imbalance, and compare our results against three state-of-the-art ensemble learning approaches, achieving better or comparable results.
Keywords: ensemble learning, AdaBoost, asymmetric cost functions, robust methods
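To illustrate the general idea of cost-asymmetric boosting described in the abstract, the sketch below implements a generic cost-sensitive variant of discrete AdaBoost. This is not the paper's specific robust cost function: the cost parameters `c_pos` and `c_neg` and the initialization scheme are illustrative assumptions, showing only how class-dependent costs can skew the weight distribution toward the minority class.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def asymmetric_adaboost(X, y, T=50, c_pos=2.0, c_neg=1.0):
    """Discrete AdaBoost with class-dependent (asymmetric) costs.

    y must be in {-1, +1}; c_pos / c_neg scale the initial sample
    weights so that errors on the (minority) positive class are
    penalized more heavily. Illustrative sketch, not the paper's
    exact robust cost function.
    """
    n = len(y)
    # Cost-weighted initialization emphasizes the positive class.
    w = np.where(y == 1, c_pos, c_neg).astype(float)
    w /= w.sum()
    learners, alphas = [], []
    for _ in range(T):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()
        if err >= 0.5:  # weak learner no better than chance: stop
            break
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        # Standard exponential reweighting; the asymmetry persists
        # through the cost-skewed initial distribution.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)

    def predict(Xq):
        agg = sum(a * h.predict(Xq) for a, h in zip(alphas, learners))
        return np.sign(agg)

    return predict
```

Setting `c_pos > c_neg` makes the ensemble pay more attention to positive examples from the first round onward, which is one common way to address the class-imbalance setting the paper targets.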