Abstract
We describe a new boosting algorithm that combines the base hypotheses with symmetric functions. Among its properties of practical relevance, the algorithm is significantly resistant to noise and remains efficient even in an agnostic learning setting; this last property is ruled out for voting-based boosting algorithms such as AdaBoost. Experiments carried out on thirty domains, most of them readily available, tend to confirm the reliability of the classifiers built.
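To make the distinction concrete, here is a minimal sketch of what "combining base hypotheses with a symmetric function" can mean. A symmetric function of T binary predictions depends only on how many of them output 1, not on which ones; majority voting is the special case where that count is thresholded, while a general symmetric combination may assign an arbitrary label to each count. The function and variable names below (`symmetric_combine`, `value_table`) are illustrative assumptions, not the paper's actual algorithm.

```python
# Hedged sketch: combining base hypotheses through a symmetric function.
# A symmetric function of T binary predictions depends only on the COUNT
# of positive predictions. Majority vote thresholds the count; a general
# symmetric combination maps each possible count to any label, which is
# strictly more expressive. Names here are illustrative, not from the paper.

def symmetric_combine(base_hypotheses, value_table, x):
    """Classify x from the number of base hypotheses predicting 1."""
    count = sum(h(x) for h in base_hypotheses)  # count is in {0, ..., T}
    return value_table[count]                   # label chosen per count

# Example: three decision stumps on a scalar input.
hs = [lambda x: int(x > 1), lambda x: int(x > 2), lambda x: int(x > 3)]

# Majority vote expressed as a symmetric function: label 1 iff count >= 2.
majority = [0, 0, 1, 1]
# A non-monotone symmetric function no weighted vote can express:
# label 1 iff the count is odd.
parity = [0, 1, 0, 1]

print(symmetric_combine(hs, majority, 2.5))  # count = 2 -> 1
print(symmetric_combine(hs, parity, 2.5))    # count = 2 -> 0
```

The point of the sketch is only expressiveness: any weighted vote reduces to a monotone `value_table`, so symmetric combinations strictly generalize voting.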
© 2002 Springer-Verlag Berlin Heidelberg
Nock, R., Lefaucheur, P. (2002). A Robust Boosting Algorithm. In: Elomaa, T., Mannila, H., Toivonen, H. (eds) Machine Learning: ECML 2002. ECML 2002. Lecture Notes in Computer Science(), vol 2430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36755-1_27
Print ISBN: 978-3-540-44036-9
Online ISBN: 978-3-540-36755-0