The k-nearest neighbor graph (KNN) is a widely used tool in many pattern recognition applications, but it has drawbacks. First, the choice of k can have a significant impact on the result because it must be fixed beforehand and does not adapt to the local density of the neighborhood. Second, KNN does not guarantee connectivity of the graph. We introduce an alternative data structure called XNN, which has a variable number of neighbors and guarantees connectivity. We demonstrate that the graph provides an improvement over KNN in several applications, including clustering, classification, and data analysis.
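To make the connectivity drawback concrete, the sketch below (a minimal illustration with hypothetical helper names, not code from the paper) builds a brute-force KNN graph over two well-separated clusters and counts its connected components. With a small fixed k, every point's nearest neighbors lie inside its own cluster, so the graph falls apart into two components:

```python
import math

def knn_graph(points, k):
    """Brute-force KNN graph: each point links to its k closest others (hypothetical helper)."""
    edges = set()
    for i, p in enumerate(points):
        dists = sorted((math.dist(p, q), j) for j, q in enumerate(points) if j != i)
        for _, j in dists[:k]:
            edges.add(frozenset((i, j)))  # symmetrize: treat edges as undirected
    return edges

def num_components(n, edges):
    """Count connected components with a simple union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for e in edges:
        a, b = tuple(e)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    return len({find(i) for i in range(n)})

# Two well-separated clusters of three points each: with k=2 every
# neighbor stays inside its own cluster, so the KNN graph is disconnected.
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
edges = knn_graph(points, k=2)
print(num_components(len(points), edges))  # → 2 (graph is disconnected)
```

Increasing k to the full neighborhood (k=5 here) merges the components, but only by over-connecting the dense regions, which is exactly the trade-off a fixed global k imposes.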
Keywords: KNN · Neighborhood graph · Data modeling