Adaptive, Hubness-Aware Nearest Neighbour Classifier with Application to Hyperspectral Data

  • Michał Romaszewski
  • Przemysław Głomb
  • Michał Cholewa
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 935)


We present an extension of the Nearest Neighbour classifier that adapts to sample imbalances in local regions of the dataset. Our approach uses the hubness statistic to measure the relation between a new sample and the existing training set, which allows us to estimate an upper limit on the number of neighbours that vote for the label of the new instance. This estimation improves classifier performance when some classes are locally under-represented. The main focus of our method is the problem of local undersampling that arises in hyperspectral data classification. Using several well-known Machine Learning and hyperspectral datasets, we show that our approach outperforms standard and distance-weighted kNN, especially for high values of k.
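The mechanism described above can be illustrated with a minimal sketch. The code below is not the authors' algorithm: it computes the k-occurrence (hubness) statistic N_k for a training set, and uses a simple, hypothetical rule (excluding zero-occurrence "anti-hub" points from the vote) as a stand-in for the paper's hubness-based upper limit on the number of voting neighbours.

```python
import numpy as np

def k_occurrences(X, k):
    """N_k(x): how often each training point appears among the k nearest
    neighbours of the other training points (its 'hubness')."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]    # indices of each point's k NNs
    return np.bincount(nn.ravel(), minlength=len(X))

def adaptive_knn_predict(X, y, query, k_max):
    """Vote with at most k_max neighbours, but drop low-hubness
    ('anti-hub') points from the vote -- a hypothetical stand-in
    for a hubness-derived cap on the voting neighbourhood."""
    occ = k_occurrences(X, k_max)
    order = np.argsort(np.linalg.norm(X - query, axis=1))[:k_max]
    # hypothetical anti-hub filter, not the paper's exact rule;
    # always keep at least the single nearest neighbour
    voters = [i for i in order if occ[i] > 0] or [order[0]]
    vals, cnt = np.unique(y[voters], return_counts=True)
    return vals[np.argmax(cnt)]          # majority vote

# Toy data: two well-separated classes
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
print(adaptive_knn_predict(X, y, np.array([0.2, 0.2]), k_max=5))  # → 0
```

In a real hyperspectral setting the distance matrix would be computed over spectral vectors and k_max would be large, which is exactly the regime where limiting the vote by hubness matters.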


Keywords: Nearest neighbour · Hyperspectral classification · Hubness



This work has been supported by the project ‘Representation of dynamic 3D scenes using the Atomic Shapes Network model’ financed by the National Science Centre, decision DEC-2011/03/D/ST6/03753. The authors would like to thank Marcin Blachnik for an extended discussion on the first version of the paper, and Krisztian Buza for his insightful comments and for making the PyHubs library available.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Michał Romaszewski¹ (corresponding author)
  • Przemysław Głomb¹
  • Michał Cholewa¹

  1. Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, Gliwice, Poland
