Using K-NN SVMs for Performance Improvement and Comparison to K-Highest Lagrange Multipliers Selection

  • Sedat Ozer
  • Chi Hau Chen
  • Imam Samil Yetik
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6218)

Abstract

Support Vector Machines (SVMs) perform very well on noise-free data sets and usually achieve good classification accuracy when the data is noisy. However, because of overfitting, accuracy degrades if the SVM is modeled improperly or if the data is excessively noisy or nonlinear. For SVM, most misclassifications occur when the test data lies close to the decision boundary. Therefore, in this paper, we investigate the support vectors found by SVM and their effect on the decision when used with the Gaussian kernel. Based on this discussion, we also propose a new technique that improves SVM performance by creating smaller clusters along the decision boundary in the higher-dimensional space, thereby reducing the overfitting that arises from model selection or noise. As an alternative SVM tuning method, we also propose using only the K highest Lagrange multipliers, rather than all of the support vectors, to summarize the decision boundary, and we compare the resulting performances. Our test results show that the number of support vectors can be decreased further by keeping only a fraction of the support vectors found at the training step, applied as a post-processing method.
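The two selection strategies the abstract mentions can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a Gaussian-kernel SVM already trained and represented by its support vectors `sv`, signed dual coefficients `alpha_y` (each entry is α_i·y_i), and bias `b`; all function and variable names are hypothetical.

```python
import numpy as np

def gaussian_kernel(a, b, gamma=1.0):
    # RBF kernel matrix between row sets a (n, d) and b (m, d)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def decision(x, sv, alpha_y, b, gamma=1.0):
    # Standard SVM decision: f(x) = sum_i alpha_i * y_i * K(sv_i, x) + b
    return gaussian_kernel(x, sv, gamma) @ alpha_y + b

def k_highest_decision(x, sv, alpha_y, b, k, gamma=1.0):
    # "K highest Lagrange multipliers": keep only the k support
    # vectors with the largest |alpha_i| and discard the rest.
    idx = np.argsort(np.abs(alpha_y))[-k:]
    return decision(x, sv[idx], alpha_y[idx], b, gamma)

def knn_svm_decision(x, sv, alpha_y, b, k, gamma=1.0):
    # K-NN SVM flavor: each test point is classified using only
    # its k nearest support vectors (local decision boundary).
    d2 = ((x[:, None, :] - sv[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, :k]               # (n_test, k) indices
    kmat = np.exp(-gamma * np.take_along_axis(d2, nn, axis=1))
    return (kmat * alpha_y[nn]).sum(axis=1) + b
```

Both functions reduce to the full SVM decision when k equals the number of support vectors, which makes it easy to check how accuracy degrades as k shrinks.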

Keywords

Support Vector Machine · KNN SVM · Post-processing · Support Vector Reduction


Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Sedat Ozer (1)
  • Chi Hau Chen (2)
  • Imam Samil Yetik (3)
  1. Electrical & Computer Eng. Dept., Rutgers University, New Brunswick, USA
  2. Electrical & Computer Eng. Dept., University of Massachusetts Dartmouth, N. Dartmouth, USA
  3. Electrical & Computer Eng. Dept., Illinois Institute of Technology, Chicago, USA
