Abstract
A method for multiple classifier selection and combination is presented. Classifiers are selected sequentially on-line using a context-specific (data-driven) formulation of classifier optimality. A finite subset of a large (or infinite) set of classifiers is used for classification, yielding not only a computational saving but also a boost in classification performance. Experiments were carried out using single-class binary classifiers on multi-class classification problems. Classifier outputs are combined using a Bayesian approach, and results show a significant improvement in classification accuracy over the AdaBoost.MH method.
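The abstract mentions combining the outputs of single-class binary classifiers with a Bayesian approach to obtain a multi-class decision. As a rough illustration of that idea (not the paper's exact scheme), a naive-Bayes style combination multiplies class priors by per-classifier output likelihoods, assuming conditional independence between classifiers; the `likelihoods` table and toy numbers below are purely hypothetical:

```python
import numpy as np

def combine_bayes(outputs, likelihoods, priors):
    """Combine binary classifier outputs into multi-class posteriors.

    outputs:     length-K array of 0/1 decisions, one per binary classifier.
    likelihoods: K x C x 2 array; likelihoods[k, c, o] = P(output_k = o | class c),
                 assumed estimated beforehand (e.g. from validation data).
    priors:      length-C array of class priors P(c).
    """
    log_post = np.log(priors).astype(float)
    for k, o in enumerate(outputs):
        # Conditional-independence ("naive Bayes") assumption across classifiers.
        log_post += np.log(likelihoods[k, :, o])
    # Normalise in log space for numerical stability.
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

# Toy example: 2 binary classifiers, 3 classes.
likelihoods = np.array([
    [[0.1, 0.9], [0.8, 0.2], [0.7, 0.3]],  # classifier 0 fires mainly on class 0
    [[0.6, 0.4], [0.2, 0.8], [0.9, 0.1]],  # classifier 1 fires mainly on class 1
])
priors = np.array([1 / 3, 1 / 3, 1 / 3])
posterior = combine_bayes(np.array([1, 0]), likelihoods, priors)  # favours class 0
```

Here the first classifier votes positive and the second negative, so the posterior mass concentrates on class 0, whose likelihoods best explain that output pattern.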
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Magee, D. (2003). A Sequential Scheduling Approach to Combining Multiple Object Classifiers Using Cross-Entropy. In: Windeatt, T., Roli, F. (eds) Multiple Classifier Systems. MCS 2003. Lecture Notes in Computer Science, vol 2709. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44938-8_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40369-2
Online ISBN: 978-3-540-44938-6