
ICDSMLA 2019, pp 932–941

Improve the Efficiency of the Classifiers Using Resample Technique on Image Segmentation Dataset

  • G. Naga RamaDevi
  • M. Janga Reddy
  • D. Baswaraj
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 601)

Abstract

One of the most popular data mining techniques is classification, which assigns the items of a collection to target classes. The main goal of classification is to accurately predict the target class for each data case. Image segmentation is a popular application area of machine learning, in which trained classifiers are used to extract relevant features of the target region [1]. In this paper, experiments are conducted using WEKA (Waikato Environment for Knowledge Analysis), an open-source tool that contains a collection of machine learning algorithms for data mining. We preprocess the imbalanced image segmentation dataset with and without the resample filter and conduct experiments on the resulting datasets with the popular classifiers J48 (C4.5), Naive Bayes, Random Forest and SMO [2]. The resample filter adds a subset of data samples to the imbalanced dataset. Finally, our methodology improves the performance of the classifiers when the resample filter is applied.
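
As a concrete illustration of the workflow described above, the following sketch shows how the resample preprocessing step and the four classifiers could be wired together with WEKA's Java API, assuming the image segmentation dataset is available as an ARFF file. The file name segment.arff, the 100% sample size and the full bias toward a uniform class distribution are illustrative assumptions, not settings reported in this paper.

    import java.util.Random;

    import weka.classifiers.Classifier;
    import weka.classifiers.Evaluation;
    import weka.classifiers.bayes.NaiveBayes;
    import weka.classifiers.functions.SMO;
    import weka.classifiers.trees.J48;
    import weka.classifiers.trees.RandomForest;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;
    import weka.filters.Filter;
    import weka.filters.supervised.instance.Resample;

    public class ResampleExperiment {
        public static void main(String[] args) throws Exception {
            // Load the image segmentation dataset (ARFF path is an assumption).
            Instances data = DataSource.read("segment.arff");
            data.setClassIndex(data.numAttributes() - 1); // class is the last attribute

            // Supervised resample filter: draws a random subsample and can bias
            // the class distribution toward uniform to counter class imbalance.
            Resample resample = new Resample();
            resample.setSampleSizePercent(100.0); // illustrative: keep the dataset size
            resample.setBiasToUniformClass(1.0);  // illustrative: full bias to uniform classes
            resample.setInputFormat(data);
            Instances resampled = Filter.useFilter(data, resample);

            // The four classifiers compared in the paper.
            Classifier[] classifiers = { new J48(), new NaiveBayes(), new RandomForest(), new SMO() };
            String[] names = { "J48", "Naive Bayes", "Random Forest", "SMO" };

            // Evaluate each classifier on the original and the resampled data
            // with 10-fold cross-validation and report the accuracy.
            for (Instances d : new Instances[] { data, resampled }) {
                String label = (d == data) ? "without resample" : "with resample";
                for (int i = 0; i < classifiers.length; i++) {
                    Evaluation eval = new Evaluation(d);
                    eval.crossValidateModel(classifiers[i], d, 10, new Random(1));
                    System.out.printf("%-13s (%s): %.2f%% correct%n",
                            names[i], label, eval.pctCorrect());
                }
            }
        }
    }

The supervised Resample filter is used here because biasing the subsample toward a uniform class distribution is the usual way to counter class imbalance in WEKA; the same preprocessing can also be applied interactively through the Preprocess tab of the WEKA Explorer.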

Keywords

Classification · Resample · J48 · Naïve Bayes · Random forest · SMO

References

  1. Liu H et al (2018) Multi-task feature selection for advancing performance of image segmentation. IEEE. https://doi.org/10.1109/icwapr.2018.8521328
  2. Naga Ramadevi G, Usha Rani K, Lavanya D (2018) Ensemble based hybrid approach for breast cancer data. In: 1st international conference on communications and cyber physical engineering (ICCCE 2018), CMR Institute of Technology, 24–25 Jan 2018. Lecture notes in electrical engineering, vol 500. https://doi.org/10.1007/978-981-13-0212-1_72
  3. Dash M, Liu H (1997) Feature selection for classification. Intell Data Anal 1:131–156
  4. Lichman M (2013) UCI machine learning repository. http://archive.ics.uci.edu/ml
  5. Liu H, Cocea M, Ding W (2017) Decision tree learning based feature evaluation and selection for image classification. In: International conference on machine learning and cybernetics, Ningbo, China, 9–12 July 2017, pp 569–574
  6. Guyon I (2003) An introduction to variable and feature selection. J Mach Learn Res 3:1157–1182
  7. Hall MA, Smith LA (1999) Feature selection for machine learning: comparing a correlation-based filter approach to the wrapper. In: Proceedings of the twelfth international Florida artificial intelligence research society conference, Orlando, Florida, 1–5 May 1999, pp 235–239
  8. Guyon I (2003) An introduction to variable and feature selection. J Mach Learn Res 3:1157–1182
  9. Naga Rama Devi G, Usha Rani K (2018) Ensemble based hybrid approach for breast cancer data. In: 1st international conference on communications and cyber physical engineering (ICCCE 2018), CMR Institute of Technology, Kandlakoya, Medchal Road, Hyderabad, India, 24–25 Jan 2018
  10. Hall MA, Smith LA (1999) Feature selection for machine learning: comparing a correlation-based filter approach to the wrapper. In: Proceedings of the twelfth international Florida artificial intelligence research society conference, Orlando, Florida, 1–5 May 1999, pp 235–239

Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Department of CSE, CMR Institute of Technology, Kandlakoya, Medchal, India
