
Retracted: A Novel Voting Mathematical Rule Classification for Image Recognition

  • Conference paper
Computational Science and Its Applications – ICCSA 2016 (ICCSA 2016)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 9790))


Abstract

In machine learning, system accuracy depends on the quality of classification, and classification accuracy plays an important role in many domains. The non-parametric K-Nearest Neighbor (KNN) classifier is among the most widely used classifiers for pattern analysis. Despite its simplicity, ease of use, and effectiveness, the main problem with KNN is selecting the number of nearest neighbors, i.e. 'k', used in the computation. At present no statistical algorithm reliably finds the optimal value of 'k', that is, the value yielding the lowest misclassification error rate.
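The sensitivity to 'k' described above can be seen in a few lines of plain Python. This is a generic KNN sketch for illustration only, not code from the paper; all names are hypothetical:

```python
from collections import Counter
import math

def knn_predict(train, labels, query, k):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of feature vectors and `labels` the matching class
    labels. The choice of k is the free parameter the abstract identifies
    as hard to tune.
    """
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    top_k = [y for _, y in dists[:k]]
    return Counter(top_k).most_common(1)[0][0]

# Tiny illustration: the predicted label flips as k changes.
train = [(0.0, 0.0), (0.1, 0.1), (1.0, 1.0), (1.1, 1.0), (0.9, 1.1)]
labels = ['A', 'A', 'B', 'B', 'B']
print(knn_predict(train, labels, (0.4, 0.4), 1))  # only the nearest point decides
print(knn_predict(train, labels, (0.4, 0.4), 5))  # every point gets a vote
```

With k=1 the single closest point ('A') decides; with k=5 the three 'B' points outvote the two 'A' points, so the prediction changes with no change in the data.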

Motivated by this problem, a new sample-space-reduction weighted voting mathematical rule (AVNM) is proposed for classification in machine learning. Like KNN, the proposed AVNM rule is non-parametric. AVNM uses a weighted voting mechanism with sample space reduction to learn and predict the class label of an unidentified sample. Unlike the KNN algorithm, AVNM requires no initial choice of a predefined variable or neighbor count. The proposed classifier also reduces the effect of outliers.
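The AVNM rule itself is not reproduced on this page. As a point of reference, the weighted voting mechanism such rules build on can be sketched as standard inverse-distance weighting, where distant, outlier-like neighbors receive little weight. This is a generic variant, not the authors' rule, and all names here are hypothetical:

```python
from collections import defaultdict
import math

def weighted_vote(train, labels, query, k, eps=1e-9):
    """Inverse-distance weighted vote among the k nearest neighbours.

    A generic weighted-voting mechanism (not the paper's AVNM rule): each
    of the k nearest neighbours votes for its class with weight 1/distance,
    so distant, outlier-like points contribute little to the class score.
    """
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    scores = defaultdict(float)
    for d, y in dists[:k]:
        scores[y] += 1.0 / (d + eps)  # closer neighbours weigh more
    return max(scores, key=scores.get)

train = [(0.0, 0.0), (0.1, 0.1), (1.0, 1.0), (1.1, 1.0), (0.9, 1.1)]
labels = ['A', 'A', 'B', 'B', 'B']
# With k=5 a plain majority vote would pick 'B' (3 votes to 2), but the
# two much closer 'A' points outweigh the three distant 'B' points.
print(weighted_vote(train, labels, (0.4, 0.4), 5))
```

This illustrates how weighting can dampen the influence of far-away samples, the same broad goal the abstract attributes to AVNM's outlier reduction.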

To verify the performance of the proposed AVNM classifier, experiments were conducted on 10 standard datasets taken from the UCI repository and one manually created dataset. Experimental results show that the proposed AVNM rule outperforms the KNN classifier and its variants, achieving higher accuracy as measured from the confusion matrix.

The proposed AVNM rule is based on a sample space reduction mechanism that automates the selection of the optimal number of nearest neighbors. Compared with the state-of-the-art KNN algorithm and its variants, AVNM yields better classification accuracy and a lower error rate on both the UCI datasets and the manually created dataset.

This paper has been retracted as its contents were plagiarized from “AVNM: A Voting based Novel Mathematical Rule for Image Classification” by Ankit Vidyarthi and Namita Mittal, published by Elsevier Ltd. in 2015.


An erratum to this chapter can be found at http://dx.doi.org/10.1007/978-3-319-42092-9_48



Author information

Correspondence to Sadrollah Abbasi.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Abbasi, S., Shahriari, A., Nemati, Y. (2016). Retracted: A Novel Voting Mathematical Rule Classification for Image Recognition. In: Gervasi, O., et al. Computational Science and Its Applications – ICCSA 2016. ICCSA 2016. Lecture Notes in Computer Science, vol 9790. Springer, Cham. https://doi.org/10.1007/978-3-319-42092-9_20


  • DOI: https://doi.org/10.1007/978-3-319-42092-9_20


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-42091-2

  • Online ISBN: 978-3-319-42092-9

  • eBook Packages: Computer Science, Computer Science (R0)
