Multi-class Leveraged k-NN for Image Classification
The k-nearest neighbors (k-NN) classification rule remains an essential tool for computer vision applications such as scene recognition. However, k-NN suffers from a major drawback: the uniform voting among the nearest prototypes in the feature space.
In this paper, we propose a new method that learns the “relevance” of prototypes and classifies test data using a weighted k-NN rule. In particular, our algorithm, called Multi-class Leveraged k-nearest neighbor (MLNN), learns the prototype weights in a boosting framework, by minimizing a surrogate exponential risk over training data. We propose two main contributions for improving computational speed and accuracy. On the one hand, we implement learning in an inherently multi-class way, which provides a significant reduction in computation time over one-versus-all approaches. Furthermore, the leveraging weights enable effective data selection, reducing the cost of k-NN search at classification time. On the other hand, we propose a kernel generalization of our approach that takes into account real-valued similarities between data in the feature space, enabling a more accurate estimation of the local class density.
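To make the weighted vote concrete, the snippet below is a minimal, hypothetical sketch (Python with NumPy) of a leveraged k-NN decision rule: each prototype carries a leverage weight, and an optional Gaussian kernel turns neighborhood membership into a real-valued similarity. The function name `leveraged_knn_predict`, the placeholder weights `alphas`, and the bandwidth `sigma` are illustrative assumptions, not the authors' implementation; in MLNN the leverage weights would be learned by boosting (minimizing the surrogate exponential risk), which is not shown here.

```python
# Hypothetical sketch of a leveraged (weighted) k-NN vote, not the authors' code.
# Each prototype j carries a leverage weight alpha_j; a query is assigned to the
# class with the largest weighted, optionally kernelized, vote among its k nearest
# prototypes.  In MLNN the alphas would be learned by boosting; here they are inputs.
import numpy as np

def leveraged_knn_predict(query, prototypes, labels, alphas, k=5, sigma=None):
    """Weighted k-NN vote.  `alphas` are per-prototype leverage weights
    (placeholders here); setting `sigma` enables a Gaussian-kernel similarity."""
    dists = np.linalg.norm(prototypes - query, axis=1)
    nn = np.argsort(dists)[:k]                      # indices of the k nearest prototypes
    if sigma is None:
        sims = np.ones(k)                           # uniform membership within the neighborhood
    else:
        sims = np.exp(-(dists[nn] ** 2) / (2.0 * sigma ** 2))  # Gaussian similarity
    scores = {}
    for j, s in zip(nn, sims):
        c = labels[j]
        scores[c] = scores.get(c, 0.0) + alphas[j] * s          # leveraged vote per class
    return max(scores, key=scores.get)

# Toy usage with made-up 2-D data and uniform leverage weights; with alpha_j = 1
# and sigma=None the rule reduces to classic uniform k-NN voting.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alphas = np.ones(len(X))
print(leveraged_knn_predict(np.array([2.5, 2.5]), X, y, alphas, k=5, sigma=1.0))
```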
We tested MLNN on three datasets of natural images. Results show that MLNN significantly outperforms both classic k-NN and weighted k-NN voting. Furthermore, using an adaptive Gaussian kernel provides a further performance improvement. Finally, the best results are obtained when combining MLNN with an appropriately learned distance metric.
Keywords: Vote Rule, Scene Recognition, Spatial Pyramid Match, Prototype Selection, Surrogate Risk