
Adaptive k-Nearest-Neighbor Classification Using a Dynamic Number of Nearest Neighbors

  • Conference paper
Advances in Databases and Information Systems (ADBIS 2007)

Abstract

Classification based on k-nearest neighbors (kNN classification) is one of the most widely used classification methods. The number k of nearest neighbors required to achieve high classification accuracy is given in advance and depends strongly on the data set used. If the data set is large, sequential or binary search for the NNs is inapplicable due to the increased computational cost. Therefore, indexing schemes are frequently used to speed up the classification process. However, if the required number of nearest neighbors is high, using an index may not be sufficient to achieve high performance. In this paper, we demonstrate that the execution of the nearest neighbor search algorithm can be interrupted if certain criteria are satisfied. This way, a decision can be made without computing all k nearest neighbors of a new object. Three different heuristics are studied for enhancing the nearest neighbor algorithm with an early-break capability. These heuristics aim at: (i) reducing computation and I/O costs as much as possible, and (ii) maintaining classification accuracy at a high level. Experimental results based on real-life data sets illustrate the applicability of the proposed method in achieving better performance than existing methods.

Work partially supported by the 2004-2006 Greek-Slovenian bilateral research program, funded by the General Secretariat of Research and Technology, Ministry of Development, Greece.
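
As a concrete illustration of the early-break idea described in the abstract, the sketch below interrupts the neighbor scan as soon as the class decision can no longer change. This is not the authors' algorithm: the function name, the brute-force neighbor ordering (standing in for an incremental, index-based NN search), and the simple majority-based stopping rule are assumptions made purely for illustration.

```python
import numpy as np

def adaptive_knn_classify(train_X, train_y, query, k):
    """Classify `query` with at most k nearest neighbors, breaking early
    once the leading class can no longer be overtaken.

    Minimal sketch only (assumes k >= 1 and a non-empty training set):
    neighbors are produced by brute-force sorting, standing in for the
    incremental index-based NN search discussed in the paper; the
    majority-based early break is one simple instance of a stopping
    criterion, not the authors' exact heuristics.
    """
    # Order training objects by distance to the query (stand-in for an
    # incremental NN search that yields neighbors one at a time).
    dists = np.linalg.norm(train_X - query, axis=1)
    order = np.argsort(dists)

    votes = {}
    for examined, idx in enumerate(order[:k], start=1):
        label = train_y[idx]
        votes[label] = votes.get(label, 0) + 1

        # Early break: if the lead of the best class over the runner-up
        # exceeds the number of neighbors still to be retrieved, the
        # final k-NN majority decision is already determined.
        ranked = sorted(votes.values(), reverse=True)
        lead = ranked[0] - (ranked[1] if len(ranked) > 1 else 0)
        remaining = k - examined
        if lead > remaining:
            break

    # Majority class among the neighbors examined so far, plus how many
    # neighbors were actually needed.
    return max(votes, key=votes.get), examined
```

In the setting of the paper, the ordered stream of neighbors would come from an incremental nearest-neighbor search over a disk-based index, so breaking early saves I/O as well as distance computations.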




Editor information

Yannis Ioannidis, Boris Novikov, Boris Rachev


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ougiaroglou, S., Nanopoulos, A., Papadopoulos, A.N., Manolopoulos, Y., Welzer-Druzovec, T. (2007). Adaptive k-Nearest-Neighbor Classification Using a Dynamic Number of Nearest Neighbors. In: Ioannidis, Y., Novikov, B., Rachev, B. (eds) Advances in Databases and Information Systems. ADBIS 2007. Lecture Notes in Computer Science, vol 4690. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75185-4_7


  • DOI: https://doi.org/10.1007/978-3-540-75185-4_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-75184-7

  • Online ISBN: 978-3-540-75185-4

  • eBook Packages: Computer Science, Computer Science (R0)
