A New Weighted k-Nearest Neighbor Algorithm Based on Newton’s Gravitational Force

  • Juan Aguilera
  • Luis C. González
  • Manuel Montes-y-Gómez
  • Paolo Rosso
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11401)


The kNN algorithm has three main advantages that make it appealing to the community: it is easy to understand, it regularly offers competitive performance, and its structure can be easily tuned to the needs of researchers to achieve better results. One common variation is weighting the instances based on their distance. In this paper we propose a weighting scheme based on Newton's gravitational force, in which a mass (or relevance) is assigned to each instance. We evaluated this idea in the kNN context over 13 benchmark data sets used for binary and multi-class classification experiments. Results in \(\mathrm {F}_1\) score, statistically validated, suggest that our proposal outperforms the original version of kNN and is statistically competitive with the distance-weighted kNN version as well.
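The gravitational weighting described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Newton-style inverse-square weight \(w_i = m_i / d_i^2\) and takes the per-instance masses as given, since the paper's mass-assignment scheme is not detailed here (with uniform masses the sketch reduces to plain distance-weighted kNN).

```python
import math
from collections import defaultdict

def gravity_knn_predict(X_train, y_train, masses, x, k=3, eps=1e-9):
    """Classify x by a gravitational vote over its k nearest neighbors.

    Each neighbor i contributes masses[i] / dist(x, x_i)**2 to its
    class; eps guards against a zero distance (exact duplicate of x).
    The masses encode per-instance relevance and are assumed given.
    """
    # Euclidean distance from x to every training instance
    dists = [math.dist(x, xi) for xi in X_train]
    # Indices of the k nearest training instances
    nearest = sorted(range(len(X_train)), key=lambda i: dists[i])[:k]
    # Accumulate the inverse-square "force" per class
    votes = defaultdict(float)
    for i in nearest:
        votes[y_train[i]] += masses[i] / (dists[i] ** 2 + eps)
    return max(votes, key=votes.get)

# Toy usage with uniform (hypothetical) masses:
X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
y = ["a", "a", "a", "b", "b", "b"]
m = [1.0] * len(X)
print(gravity_knn_predict(X, y, m, (0.5, 0.5), k=3))  # -> a
```

Because the weight decays with the square of the distance, a close neighbor with a large mass dominates the vote, which is the intuition behind letting relevance, not just proximity, drive the classification.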



This research was partially supported by CONACYT-Mexico (project FC-2410). The work of Paolo Rosso has been partially funded by the SomEMBED TIN2015-71147-C2-1-P MINECO research project.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Juan Aguilera (1, corresponding author)
  • Luis C. González (1)
  • Manuel Montes-y-Gómez (2)
  • Paolo Rosso (3)
  1. Universidad Autónoma de Chihuahua, Chihuahua, Mexico
  2. Instituto Nacional de Astrofísica, Óptica y Electrónica, Puebla, Mexico
  3. Universitat Politècnica de València, Valencia, Spain
