K Nearest Neighbor Classification with Local Induction of the Simple Value Difference Metric
Classical k nearest neighbor (k-nn) classification assumes that a fixed global metric is defined and that the search for nearest neighbors is always based on this global metric. In this paper we present a model with local induction of a metric. Each test object induces a local metric from its own neighborhood and selects its k nearest neighbors according to this locally induced metric. To induce both the global and the local metric we use the weighted Simple Value Difference Metric (SVDM). The experimental results show that the proposed classification model with local metric induction reduces the classification error by up to a factor of several in comparison with the classical k-nn method.
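The two-stage scheme described above (a global SVDM to delimit a neighborhood, then a local SVDM re-induced from that neighborhood to pick the final k neighbors) can be sketched in code. This is a minimal illustration, not the authors' exact algorithm: it uses the unweighted SVDM, and the function names, the neighborhood size `m`, and the majority-vote step are illustrative assumptions.

```python
from collections import Counter, defaultdict

def svdm_tables(X, y):
    """Estimate P(class | attribute value) for each nominal attribute."""
    classes = set(y)
    tables = []
    for a in range(len(X[0])):
        counts = defaultdict(Counter)
        for row, label in zip(X, y):
            counts[row[a]][label] += 1
        probs = {v: {c: cnt[c] / sum(cnt.values()) for c in classes}
                 for v, cnt in counts.items()}
        tables.append((probs, classes))
    return tables

def svdm_dist(x1, x2, tables):
    """Unweighted SVDM: sum over attributes of class-probability differences."""
    d = 0.0
    for a, (probs, classes) in enumerate(tables):
        p1, p2 = probs.get(x1[a], {}), probs.get(x2[a], {})
        d += sum(abs(p1.get(c, 0.0) - p2.get(c, 0.0)) for c in classes)
    return d

def classify_local(x, X, y, k=3, m=20):
    # Stage 1: global SVDM induced from the whole training set
    global_tables = svdm_tables(X, y)
    order = sorted(range(len(X)), key=lambda i: svdm_dist(x, X[i], global_tables))
    neigh = order[:m]  # neighborhood of the test object x
    # Stage 2: local SVDM re-induced from the neighborhood only
    local_tables = svdm_tables([X[i] for i in neigh], [y[i] for i in neigh])
    by_local = sorted(neigh, key=lambda i: svdm_dist(x, X[i], local_tables))
    # Majority vote among the k neighbors nearest in the local metric
    return Counter(y[i] for i in by_local[:k]).most_common(1)[0][0]
```

For example, `classify_local(("red", "s"), X, y, k=3, m=6)` on a small nominal-attribute training set returns the majority class among the three neighbors nearest under the locally induced SVDM.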