Adaptive, Hubness-Aware Nearest Neighbour Classifier with Application to Hyperspectral Data
Abstract
We present an extension of the Nearest Neighbour classifier that adapts to sample imbalances in local regions of the dataset. Our approach uses the hubness statistic as a measure of the relation between a new sample and the existing training set, which allows us to estimate an upper limit on the number of neighbours that vote for the label of the new instance. This estimation improves classifier performance in situations where some classes are locally under-represented. Our method primarily targets the problem of local undersampling that arises in hyperspectral data classification. Using several well-known Machine Learning and hyperspectral datasets, we show that our approach outperforms standard and distance-weighted kNN, especially for high values of k.
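To make the idea concrete, the sketch below is a minimal, illustrative hubness-aware kNN in Python, not the authors' exact algorithm. It computes the k-occurrence (hubness) counts of the training points and then uses a simple, assumed heuristic to cap the number of voting neighbours per query; the names `k_occurrence`, `hubness_capped_knn_predict`, `k_max` and the capping rule itself are assumptions for illustration only.

```python
# Illustrative sketch of a hubness-aware kNN with an adaptive per-query
# cap on the number of voting neighbours. The capping heuristic is an
# assumption, not the rule proposed in the paper.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def k_occurrence(X_train, k):
    """N_k(x): how often each training point appears among the k nearest
    neighbours of the other training points (its hubness)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_train)
    _, idx = nn.kneighbors(X_train)  # first column is the point itself
    return np.bincount(idx[:, 1:].ravel(), minlength=len(X_train))


def hubness_capped_knn_predict(X_train, y_train, X_test, k_max=20):
    """Classify each test point with at most k_max neighbours, shrinking
    the vote when the neighbourhood is dominated by hubs (illustrative)."""
    counts = k_occurrence(X_train, k_max)
    nn = NearestNeighbors(n_neighbors=k_max).fit(X_train)
    _, idx = nn.kneighbors(X_test)

    y_pred = np.empty(len(X_test), dtype=y_train.dtype)
    for i, neigh in enumerate(idx):
        # Assumed heuristic: stop adding neighbours once their cumulative
        # hubness exceeds the average total hubness of a k_max-neighbourhood.
        cum = np.cumsum(counts[neigh])
        k_eff = int(np.searchsorted(cum, counts.mean() * k_max)) + 1
        k_eff = min(max(k_eff, 1), k_max)
        votes = y_train[neigh[:k_eff]]
        labels, n_votes = np.unique(votes, return_counts=True)
        y_pred[i] = labels[np.argmax(n_votes)]
    return y_pred
```

Under this heuristic, a query surrounded by strong hubs receives a smaller effective k, which mimics the behaviour described in the abstract for locally under-represented classes; the actual estimate in the paper is derived from the hubness statistic rather than this cumulative threshold.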
Keywords
Nearest neighbour · Hyperspectral classification · Hubness
Acknowledgments
This work has been supported by the project ‘Representation of dynamic 3D scenes using the Atomic Shapes Network model’, financed by the National Science Centre, decision DEC-2011/03/D/ST6/03753. The authors would like to thank Marcin Blachnik for an extended discussion of the first version of the paper, and Krisztian Buza for his insightful comments and for making the PyHubs library (http://www.biointelligence.hu/pyhubs) available.