A Linear Combination of Classifiers via Rank Margin Maximization

  • Claudio Marrocco
  • Paolo Simeone
  • Francesco Tortorella
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6218)

Abstract

The method we present builds a weighted linear combination of already trained dichotomizers, with the weights chosen to maximize the minimum rank margin of the resulting ranking system. This is particularly suited to real applications where key parameters such as costs and priors are difficult to determine exactly; in such cases ranking is needed rather than classification. A ranker can be seen as a more basic system than a classifier, since it orders the samples according to the value the classifier assigns to each of them. Experiments on popular benchmarks, together with a comparison with other typical rankers, show how effective the approach can be.
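The abstract does not spell out the optimization, but maximizing the minimum rank margin of a nonnegative, normalized weight vector over all positive/negative score-difference pairs can be cast as a standard linear program. The sketch below is an illustration of that generic formulation, not the paper's exact method; the function name, the simplex constraints on the weights, and the use of SciPy's `linprog` are all assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import linprog

def max_min_rank_margin(scores_pos, scores_neg):
    """Illustrative max-min rank-margin combiner (not the paper's exact LP).

    scores_pos: (n_pos, K) base-classifier scores on positive samples.
    scores_neg: (n_neg, K) base-classifier scores on negative samples.
    Returns (w, rho): weights w >= 0 with sum(w) = 1 maximizing the smallest
    pairwise rank margin rho = min_{i,j} sum_k w_k (h_k(x_i) - h_k(x_j)).
    """
    n_pos, K = scores_pos.shape
    # All pairwise score differences, one row per (positive, negative) pair.
    D = (scores_pos[:, None, :] - scores_neg[None, :, :]).reshape(-1, K)
    # LP variables: [w_1 .. w_K, rho]; objective: minimize -rho.
    c = np.zeros(K + 1)
    c[-1] = -1.0
    # Margin constraints: rho - D @ w <= 0 for every pair.
    A_ub = np.hstack([-D, np.ones((D.shape[0], 1))])
    b_ub = np.zeros(D.shape[0])
    # Normalization: weights sum to one; rho itself is unbounded.
    A_eq = np.zeros((1, K + 1))
    A_eq[0, :K] = 1.0
    b_eq = np.array([1.0])
    bounds = [(0, None)] * K + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:K], res.x[-1]
```

With two base classifiers, one of which ranks every positive above every negative, the LP concentrates the weight on that classifier and returns its worst-case score gap as the margin.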

Keywords

Margin · Ranking · Combination of Classifiers


Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Claudio Marrocco (1)
  • Paolo Simeone (1)
  • Francesco Tortorella (1)
  1. DAEIMI, Università degli Studi di Cassino, Cassino, Italy
