The Entire Regularization Path for the Support Vector Domain Description

  • Karl Sjöstrand
  • Rasmus Larsen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4190)


The support vector domain description is a one-class classification method that estimates the shape and extent of the distribution of a data set. The resulting decision boundary separates the data into outliers, lying outside the boundary, and inliers, lying inside it. The method bears close resemblance to the two-class support vector machine classifier. Recently, it was shown that the regularization path of the support vector machine is piecewise linear and that the entire path can be computed efficiently. This paper shows that this property carries over to the support vector domain description. Using our results, the solution to the one-class classification problem can be obtained for any amount of regularization with roughly the same computational complexity required to solve for a single value of the regularization parameter. The possibility of evaluating the results for any amount of regularization not only yields more accurate and reliable models, but also makes way for new applications. We illustrate the potential of the method by determining the order of inclusion in the model for a set of corpora callosa outlines.
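To make the model concrete: in the special case of a linear kernel with regularization weak enough that no point may be excluded, the support vector domain description reduces to the minimal enclosing ball of the data. The sketch below (a minimal illustration, not the authors' path-following algorithm; the function name and iteration count are illustrative) solves this special case with the classic Frank-Wolfe iteration on the SVDD dual, where each step moves the center toward the currently farthest point:

```python
def svdd_linear_hard(points, iters=2000):
    """Minimal enclosing ball via Frank-Wolfe ascent on the SVDD dual.

    This is the hard-margin special case of the support vector domain
    description (linear kernel, no outliers allowed); the center is the
    weighted mean a = sum_i alpha_i x_i of the data points.
    """
    c = list(points[0])  # start the center at an arbitrary data point
    for t in range(1, iters + 1):
        # The dual gradient is largest at the point farthest from the center.
        far = max(points, key=lambda p: sum((pi - ci) ** 2 for pi, ci in zip(p, c)))
        # Move the center toward that point with a diminishing step 1/(t+1).
        c = [ci + (fi - ci) / (t + 1) for ci, fi in zip(c, far)]
    # Squared radius: the largest squared distance from the final center.
    r2 = max(sum((pi - ci) ** 2 for pi, ci in zip(p, c)) for p in points)
    return c, r2

# A point is classified as an inlier iff its squared distance
# to the center is at most r2.
center, r2 = svdd_linear_hard([(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)])
```

With stronger regularization the dual coefficients are capped, individual points may fall outside the ball, and a separate solution exists for every value of the regularization parameter; the paper's contribution is that these solutions vary piecewise linearly, so the whole family can be traced at roughly the cost of one.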


Keywords: Support Vector Machine, Regularization Parameter, Mahalanobis Distance, Decision Boundary, Regularization Path



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Karl Sjöstrand 1, 2
  • Rasmus Larsen 1

  1. Informatics and Mathematical Modelling, Technical University of Denmark, Denmark
  2. Department of Radiology, VAMC, University of California-San Francisco, USA
