Abstract
Even though the support vector machine (SVM) has been shown to provide good generalization performance, the classification result of a practically implemented SVM is often far from the theoretically expected level, because implementations rely on approximated algorithms owing to the high time and space complexity. To improve the limited classification performance of a real SVM, we propose to use SVM ensembles with bagging (bootstrap aggregating). Each individual SVM is trained independently on training samples chosen at random via a bootstrap technique. The trained SVMs are then aggregated to make a collective decision in several ways, such as majority voting, LSE (least-squares estimation)-based weighting, and double-layer hierarchical combining. Various simulation results for IRIS data classification and hand-written digit recognition show that the proposed SVM ensembles with bagging greatly outperform a single SVM in classification accuracy.
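The core idea in the abstract can be sketched in a few lines: train several SVMs on bootstrap resamples of the training set, then aggregate their predictions by majority voting. The sketch below uses scikit-learn's `SVC` on the IRIS data mentioned in the abstract; the kernel, ensemble size, and random seed are illustrative assumptions, not the authors' experimental settings.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
n_estimators = 11  # illustrative ensemble size, not from the paper
models = []
for _ in range(n_estimators):
    # Bootstrap: draw training points with replacement.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    models.append(SVC(kernel="rbf", C=1.0).fit(X_tr[idx], y_tr[idx]))

# Majority voting over the individual SVM predictions.
votes = np.stack([m.predict(X_te) for m in models])  # shape: (n_estimators, n_test)
majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
acc = (majority == y_te).mean()
```

The LSE-based weighting and hierarchical combining schemes mentioned in the abstract replace the voting step with a learned combination of the individual outputs; majority voting is shown here only because it is the simplest aggregator to state.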
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Kim, HC., Pang, S., Je, HM., Kim, D., Bang, SY. (2002). Support Vector Machine Ensemble with Bagging. In: Lee, SW., Verri, A. (eds) Pattern Recognition with Support Vector Machines. SVM 2002. Lecture Notes in Computer Science, vol 2388. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45665-1_31
DOI: https://doi.org/10.1007/3-540-45665-1_31
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44016-1
Online ISBN: 978-3-540-45665-0