Constrained Learning Vector Quantization or Relaxed k-Separability
Neural networks and other sophisticated machine learning algorithms frequently miss simple solutions that can be discovered by more constrained learning methods. The transition from a single neuron solving linearly separable problems, to a multithreshold neuron solving k-separable problems, to neurons implementing prototypes solving q-separable problems, is investigated. Using the Learning Vector Quantization (LVQ) approach, this transition is presented as going from two prototypes defining a single hyperplane, to many co-linear prototypes defining parallel hyperplanes, to unconstrained prototypes defining a Voronoi tessellation. For most datasets, relaxing the co-linearity condition improves accuracy at the cost of increased model complexity, but for data with an inherent logical structure, LVQ algorithms with constraints significantly outperform the original LVQ and many other algorithms.
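To make the starting point of this progression concrete, the following is a minimal sketch of the plain (unconstrained) LVQ1 rule that the paper builds on: the nearest prototype is attracted to a training sample of its own class and repelled by a sample of a different class, and classification assigns each point the label of its nearest prototype (the Voronoi-tessellation case). Function names and parameters here are illustrative, not from the paper, and the co-linearity constraints the authors propose are not included.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """Plain LVQ1: move the nearest prototype toward same-class samples
    and away from different-class samples. Illustrative sketch only."""
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            j = np.argmin(np.linalg.norm(P - x, axis=1))  # nearest prototype
            step = lr * (x - P[j])
            P[j] += step if proto_labels[j] == label else -step
        lr *= 0.9  # gradually decay the learning rate
    return P

def lvq_predict(X, prototypes, proto_labels):
    """Label each sample by its nearest prototype (Voronoi cell)."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]
```

With exactly two prototypes, the decision boundary is the perpendicular bisector hyperplane between them, which corresponds to the single-hyperplane (linearly separable) case described above; adding more prototypes, constrained to lie on a common line, yields the parallel-hyperplane (k-separable) case.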
Keywords: Hidden Node, Separable Problem, Voronoi Tessellation, Learning Vector Quantization, Projection Pursuit