
Applied Intelligence, Volume 48, Issue 2, pp 331–342

Research on parameter selection method for support vector machines

Article

Abstract

The kernel parameter and the penalty parameter C are the main factors that affect the learning performance of a support vector machine. However, existing methods for selecting the kernel parameter and the penalty parameter C have notable shortcomings: they achieve limited accuracy when classifying multi-category samples, and some even discard part of the samples during training, which compromises the integrity of the experimental data. In contrast, this paper improves the selection of the support vector machine kernel parameter and penalty parameter in two ways. First, it obtains the kernel parameter value by maximizing the separation interval between samples. Second, it refines the estimate of generalization ability based on the influence of non-boundary support vectors on the stability of the support vector machine. The method takes full account of all the training sample data, is applicable to most sample types, and has the advantages of low initialization requirements and high test accuracy. The paper finally uses multiple UCI data sets and a facial image recognition task to verify the method. The experimental results show that the method is feasible, effective and stable.
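A minimal sketch of the two-step parameter-selection workflow the abstract describes, using scikit-learn with an RBF kernel. The separation_score criterion and the cross-validated grid over C below are illustrative stand-ins for the paper's interval-maximization and non-boundary support vector analysis, not the authors' exact procedure; the Iris data set stands in for one of the UCI sample sets mentioned above.

```python
# Sketch (not the paper's exact algorithm): choose the RBF width gamma by
# maximizing a between-class separation score in kernel space, then choose
# the penalty parameter C by cross-validated accuracy, keeping all samples.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def separation_score(X, y, gamma):
    """Mean within-class kernel similarity minus mean between-class similarity."""
    K = rbf_kernel(X, X, gamma=gamma)
    same_class = (y[:, None] == y[None, :])
    return K[same_class].mean() - K[~same_class].mean()

X, y = load_iris(return_X_y=True)

# Step 1: pick gamma from the separation criterion alone (no classifier yet).
gammas = np.logspace(-3, 2, 20)
best_gamma = max(gammas, key=lambda g: separation_score(X, y, g))

# Step 2: with gamma fixed, pick C by 5-fold cross-validated accuracy.
Cs = np.logspace(-2, 3, 12)
best_C = max(
    Cs,
    key=lambda c: cross_val_score(
        SVC(C=c, gamma=best_gamma, kernel="rbf"), X, y, cv=5
    ).mean(),
)

print(f"selected gamma = {best_gamma:.4g}, C = {best_C:.4g}")
```

Because every training sample enters both the separation criterion and the cross-validation, the sketch reflects the abstract's emphasis on using the full training set rather than discarding samples.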

Keywords

Kernel parameter · Penalty parameter · Degree of separation · Support vector machine


Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
