Identifying Non-pulsar Radiation and Predicting Chess Endgame Result Using ARSkNN

  • Yash Agarwal
  • Ashish Kumar (email author)
  • Roheet Bhatnagar
  • Sumit Srivastava

Conference paper. Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 141).


We are living in a data age. With the expansion of the Internet of Things, the number of devices connected to the Internet has surged, and every device, from smart sensors and smartphones to systems installed in manufacturing units, hospitals and vehicles, generates data. These developments have not only escalated data generation but also created a need to analyse raw data to identify patterns. Data mining techniques are therefore deployed extensively to extract information, and their accuracy and cost-effectiveness across a range of domains are well established. In supervised learning, instance-based classifiers such as kNN typically rely on distance estimation. In this analysis, the regular kNN classifier is compared with ARSkNN, which replaces the conventional distance estimation with a mass estimation approach. On the datasets chosen for this analysis, ARSkNN proved commensurate with (or superior to) kNN in accuracy while drastically reducing computation time.
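The comparison above hinges on the step that ARSkNN changes: conventional kNN ranks all training instances by a distance metric and takes a majority vote among the k nearest. The following is a minimal sketch of that distance-based baseline, using toy two-dimensional data and a Euclidean metric (both hypothetical, for illustration only); ARSkNN would substitute a mass-based similarity measure for the distance computation, as described in the cited work.

```python
from collections import Counter
import math

def euclidean(a, b):
    # Conventional kNN similarity: smaller distance means more similar.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, labels, query, k=3):
    # Rank training points by distance to the query, then vote among the k nearest.
    ranked = sorted(range(len(train)), key=lambda i: euclidean(train[i], query))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy two-class data (hypothetical feature values, not from the paper's datasets).
X = [(0.0, 0.0), (0.1, 0.2), (0.9, 1.0), (1.0, 0.8)]
y = ["non-pulsar", "non-pulsar", "pulsar", "pulsar"]
print(knn_predict(X, y, (0.05, 0.1), k=3))  # two of the three nearest are non-pulsar
```

The distance computation against every training point is what dominates kNN's cost; replacing it with mass estimation is the source of the computation-time reduction reported for ARSkNN.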


Keywords: Data mining · Classification · Nearest neighbors · ARSkNN



Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Yash Agarwal (1)
  • Ashish Kumar (2), email author
  • Roheet Bhatnagar (2)
  • Sumit Srivastava (2)

  1. Department of ECE, Manipal Institute of Technology, Manipal University, Manipal, India
  2. Department of CSE, Manipal University Jaipur, Jaipur, India
