The Support Vector Machine
This chapter comprises three sections. The first presents an overview of statistical learning theory (SLT) as applied to machine learning. The topics covered are empirical versus true risk minimization, the risk minimization principle (RMP), the theoretical concept of risk minimization, the function f0(X) that minimizes the expected (or true) risk, asymptotic consistency or uniform convergence, an example of the generalized bound for binary classification and, finally, how learning machines are formed for:
Linearly separable systems
Linearly non-separable systems
Non-linear, non-separable systems
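As a hedged illustration (not from the chapter itself), the first of the three cases can be sketched with a standard SVM implementation; scikit-learn and the toy data below are assumptions chosen for brevity. A linear kernel with a large regularization constant C approximates the hard-margin machine appropriate to linearly separable data.

```python
# Illustrative sketch only: a hard-margin-like SVM on linearly
# separable toy data (library and data are assumptions, not the book's).
from sklearn.svm import SVC

# Toy data: class is determined by the sign of the first coordinate.
X = [[-2, 0], [-1, 1], [1, 0], [2, 1]]
y = [0, 0, 1, 1]

# Linear kernel; a large C approximates the hard-margin (separable) case.
clf = SVC(kernel="linear", C=1e3).fit(X, y)
print(clf.predict([[-1.5, 0.5], [1.5, 0.5]]))  # → [0 1]
```

Lowering C yields the soft-margin machine used for the linearly non-separable case, where some training points are allowed inside the margin at a penalty.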
It then introduces the topic of kernels, what they are, and how they might be chosen. A brief pointer is provided to the SVM literature available on the web.
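A minimal sketch of why kernels matter, under assumptions not taken from the text: the XOR pattern cannot be separated by any hyperplane in the input space, but a radial basis function (RBF) kernel separates it implicitly in a higher-dimensional feature space. The particular gamma and C values are illustrative choices.

```python
# Illustrative sketch: the RBF kernel handles the non-linear,
# non-separable XOR pattern (parameter values are assumptions).
from sklearn.svm import SVC

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]  # XOR: no line in the plane separates these classes

clf = SVC(kernel="rbf", gamma=2.0, C=10.0).fit(X, y)
print(clf.predict(X))  # → [0 1 1 0]
```

Swapping `kernel="rbf"` for `kernel="linear"` here cannot reproduce all four labels, which is the point the kernel section develops.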
These sections are followed by a sketch of how the SVM may be hybridized with the genetic algorithm (GA) for feature subset selection; this sketch points the way to the value of further hybridization with an ensemble approach, the topic of the next chapter.
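One common design for such a hybrid, sketched here as an assumption rather than the book's own algorithm, encodes each candidate feature subset as a binary mask and scores it by the cross-validated accuracy of an SVM trained on the selected features. To stay short, the loop below uses mutation only (no crossover), so it is a stripped-down evolutionary search, not a full GA; the dataset, population size, and mutation rate are illustrative.

```python
# Hedged sketch of GA-style wrapper feature selection for an SVM.
# Chromosome = binary feature mask; fitness = cross-validated accuracy.
import random
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
n_features = X.shape[1]
random.seed(0)

def fitness(mask):
    cols = [i for i, bit in enumerate(mask) if bit]
    if not cols:
        return 0.0  # an empty subset is worthless
    return cross_val_score(SVC(kernel="rbf"), X[:, cols], y, cv=3).mean()

def mutate(mask, rate=0.25):
    # Flip each bit independently with the given probability.
    return tuple(bit ^ (random.random() < rate) for bit in mask)

# Tiny elitist loop: keep the best mask, mutate it to form offspring.
pop = [tuple(random.randint(0, 1) for _ in range(n_features)) for _ in range(6)]
for _ in range(10):
    pop.sort(key=fitness, reverse=True)
    best = pop[0]
    pop = [best] + [mutate(best) for _ in range(5)]

print(best, round(fitness(best), 3))
```

A full hybrid would add crossover and a population large enough to explore competing subsets; the fitness function is the part that couples the GA to the SVM.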
Keywords: Support vector machine, Statistical learning theory, Empirical risk minimization, VC dimension, Radial basis function, Risk minimization principle, Sequential minimal optimization, Structural risk minimization