A Multiple Kernel Support Vector Machine Scheme for Simultaneous Feature Selection and Rule-Based Classification

  • Zhenyu Chen
  • Jianping Li
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4426)

Abstract

In many applications, such as bioinformatics and medical decision making, interpretability is essential to make a model acceptable to its users and to help experts discover novel and potentially valuable knowledge hidden in the data. This paper presents a novel feature selection and rule extraction method based on a multiple kernel support vector machine (MK-SVM). The method has two notable properties. First, the multiple kernel is expressed as a convex combination of single-feature basic kernels, which transforms feature selection in the SVM setting into an ordinary multiple-parameter learning problem; a 1-norm based linear program is proposed to optimize these parameters. Second, rules are obtained in a straightforward way: only the support vectors are needed. It is shown theoretically that every support vector obtained by this method is a vertex of a hypercube, and a tree-like algorithm is then proposed to extract if-then rules. Experiments on three UCI datasets demonstrate the effectiveness and efficiency of the approach.
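
The following Python sketch is a minimal illustration, not the authors' implementation, of the kernel construction described above: each basic kernel operates on a single feature, and the multiple kernel is their convex combination with weights mu_d. The 1-norm linear program that learns sparse weights is not reproduced here; the weights are fixed by hand, and scikit-learn's SVC with a precomputed Gram matrix stands in for the paper's solver. The function names, the RBF choice for the basic kernels, and the toy data are all illustrative assumptions.

# Minimal sketch (not the paper's implementation): a multiple kernel built as a
# convex combination of single-feature "basic" kernels, used with an off-the-shelf
# SVM via a precomputed Gram matrix. In the paper, the weights mu_d are learned by
# a 1-norm linear program that drives most of them to zero, which performs feature
# selection; here the weights are fixed by hand purely for illustration.
import numpy as np
from sklearn.svm import SVC  # stand-in solver; the paper uses its own LP formulation

def single_feature_rbf(Xa, Xb, d, gamma=1.0):
    # Basic kernel k_d that looks at feature d only (RBF chosen as an example).
    diff = Xa[:, d:d + 1] - Xb[:, d:d + 1].T
    return np.exp(-gamma * diff ** 2)

def multiple_kernel(Xa, Xb, mu, gamma=1.0):
    # Convex combination K = sum_d mu_d * k_d, with mu_d >= 0 and sum_d mu_d = 1.
    assert np.all(mu >= 0) and np.isclose(mu.sum(), 1.0)
    return sum(m * single_feature_rbf(Xa, Xb, d, gamma)
               for d, m in enumerate(mu) if m > 0)

# Toy data: only features 0 and 1 carry the label; features 2-4 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + X[:, 1])

# Hypothetical sparse weights (in the paper these would come out of the 1-norm LP).
mu = np.array([0.5, 0.5, 0.0, 0.0, 0.0])
selected = np.flatnonzero(mu)  # features with nonzero weight are "selected"

K_train = multiple_kernel(X, X, mu)
clf = SVC(kernel="precomputed", C=1.0).fit(K_train, y)

X_test = rng.normal(size=(50, 5))
K_test = multiple_kernel(X_test, X, mu)  # rows: test points, columns: training points
print("selected features:", selected)
print("test accuracy:", clf.score(K_test, np.sign(X_test[:, 0] + X_test[:, 1])))

Because the combined kernel is sparse in mu, any feature whose weight is zero never enters the Gram matrix, which is the sense in which kernel-weight learning and feature selection coincide in this scheme.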

Keywords

Support Vector Machine · Support Vector · Feature Selection · Coverage Rate · Rule Extraction

Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Zhenyu Chen 1, 2
  • Jianping Li 1
  1. Institute of Policy & Management, Chinese Academy of Sciences, Beijing 100080, China
  2. Graduate University of Chinese Academy of Sciences, Beijing 100039, China
