Abstract
This paper proposes a new classification method, the Nearest Cluster Classifier (NCC), which uses clustering both to reduce the training set of the K-Nearest Neighbor (KNN) classifier and to enhance its performance. Inspired by the traditional KNN algorithm, the main idea is to classify a test sample according to the label of its nearest neighbor. First, the training set is partitioned by several runs of a simple clustering algorithm, from which NCC extracts a large number of candidate clusters. The label of each cluster center is then determined by a majority vote over the class labels of the patterns in that cluster. NCC iteratively adds clusters to a pool of selected clusters, which serves as the training set of the final 1-NN classifier, for as long as the 1-NN accuracy over the combined training and validation sets improves. This selected set of the most accurate clusters becomes the training set of the final 1-NN classifier, and the class label of a new test sample is determined by the class label of the nearest selected cluster center. Computationally, NCC is about K times faster than KNN. The proposed method is evaluated on several real datasets from the UCI repository; empirical results show a clear improvement over the KNN classifier in terms of both accuracy and time complexity.
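The abstract's pipeline (multiple clustering runs, majority-vote labeling of centers, greedy cluster selection against train-plus-validation accuracy, 1-NN prediction on the selected centers) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes k-means as the "simple clustering algorithm" (the abstract does not name one), and the function names `train_ncc` and `predict_ncc` are invented for this sketch.

```python
import numpy as np

def kmeans(X, k, iters=20, rng=None):
    """Plain Lloyd's k-means; returns cluster centers and point assignments."""
    if rng is None:
        rng = np.random.default_rng()
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    return centers, assign

def one_nn_accuracy(centers, labels, X, y):
    """Accuracy of a 1-NN classifier whose 'training set' is the given centers."""
    centers = np.asarray(centers)
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    pred = np.asarray(labels)[dists.argmin(axis=1)]
    return float(np.mean(pred == y))

def train_ncc(X_train, y_train, X_val, y_val, k=3, runs=5, seed=0):
    """Build the pool of labeled cluster centers used by the final 1-NN classifier."""
    rng = np.random.default_rng(seed)
    # 1) Several runs of a simple clustering algorithm -> many candidate clusters.
    candidates = []
    for _ in range(runs):
        centers, assign = kmeans(X_train, k, rng=rng)
        for j in range(k):
            members = y_train[assign == j]
            if len(members) > 0:
                # 2) Label each center by majority vote among its members.
                vals, counts = np.unique(members, return_counts=True)
                candidates.append((centers[j], vals[counts.argmax()]))
    # 3) Greedily add clusters while 1-NN accuracy on train+validation improves.
    X_eval = np.vstack([X_train, X_val])
    y_eval = np.concatenate([y_train, y_val])
    pool, pool_labels, best_acc = [], [], 0.0
    remaining = list(candidates)
    while remaining:
        accs = [one_nn_accuracy(pool + [c], pool_labels + [l], X_eval, y_eval)
                for c, l in remaining]
        i = int(np.argmax(accs))
        if pool and accs[i] <= best_acc:
            break  # no further improvement: stop adding clusters
        best_acc = accs[i]
        c, l = remaining.pop(i)
        pool.append(c)
        pool_labels.append(l)
    return np.array(pool), np.array(pool_labels)

def predict_ncc(pool, pool_labels, X):
    """Classify each sample by the label of its nearest selected cluster center."""
    dists = np.linalg.norm(X[:, None, :] - pool[None, :, :], axis=2)
    return pool_labels[dists.argmin(axis=1)]
```

The claimed speedup follows from the structure: the final classifier compares each test sample against a small pool of cluster centers rather than the full training set, and uses a single nearest neighbor rather than K of them.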
References
Fix, E., Hodges, J.L.: Discriminatory analysis, nonparametric discrimination: Consistency properties. Technical Report 4, USAF School of Aviation Medicine, Randolph Field, Texas (1951)
Cover, T.M., Hart, P.E.: Nearest neighbor pattern classification. IEEE Trans. Inform. Theory IT-13(1), 21–27 (1967)
Hellman, M.E.: The nearest neighbor classification rule with a reject option. IEEE Trans. Syst. Man Cybern. 3, 179–185 (1970)
Fukunaga, K., Hostetler, L.: k-nearest-neighbor bayes-risk estimation. IEEE Trans. Information Theory 21(3), 285–293 (1975)
Dudani, S.A.: The distance-weighted k-nearest-neighbor rule. IEEE Trans. Syst. Man Cybern. SMC-6, 325–327 (1976)
Bailey, T., Jain, A.: A note on distance-weighted k-nearest neighbor rules. IEEE Trans. Systems, Man, Cybernetics 8, 311–313 (1978)
Bermejo, S., Cabestany, J.: Adaptive soft k-nearest-neighbour classifiers. Pattern Recognition 33, 1999–2005 (2000)
Jozwik, A.: A learning scheme for a fuzzy k-nn rule. Pattern Recognition Letters 1, 287–289 (1983)
Keller, J.M., Gray, M.R., Givens, J.A.: A fuzzy k-nearest neighbor algorithm. IEEE Trans. Syst. Man Cybern. SMC-15(4), 580–585 (1985)
Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification. John Wiley & Sons (2000)
Itqon, Kaneko, S., Igarashi, S.: Improving performance of k-nearest neighbor classifier by test features. Transactions of the Institute of Electronics, Information and Communication Engineers (2001)
Lam, L., Suen, C.Y.: Application of majority voting to pattern recognition: An analysis of its behavior and performance. IEEE Transactions on Systems, Man, and Cybernetics 27(5), 553–568 (1997)
Jain, A.K., Dubes, R.C.: Algorithms for Clustering Data. Prentice-Hall, Englewood Cliffs (1988)
Newman, D.J., Hettich, S., Blake, C.L., Merz, C.J.: UCI repository of machine learning databases (1998), http://www.ics.uci.edu/~mlearn/MLSummary.html
Wu, X., et al.: Top 10 algorithms in data mining. Knowledge and Information Systems 14(1), 1–37. Springer-Verlag London Limited (2008)
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Parvin, H., Mohamadi, M., Parvin, S., Rezaei, Z., Minaei, B. (2012). Nearest Cluster Classifier. In: Corchado, E., Snášel, V., Abraham, A., Woźniak, M., Graña, M., Cho, S.B. (eds) Hybrid Artificial Intelligent Systems. HAIS 2012. Lecture Notes in Computer Science, vol 7208. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-28942-2_24
DOI: https://doi.org/10.1007/978-3-642-28942-2_24
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-28941-5
Online ISBN: 978-3-642-28942-2
eBook Packages: Computer Science (R0)