Data Reduction Algorithm for Machine Learning and Data Mining

  • Ireneusz Czarnowski
  • Piotr Jȩdrzejowicz
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5027)

Abstract

The paper proposes an approach to data reduction. Data reduction procedures are of vital importance in machine learning and data mining. To solve the data reduction problem, an agent-based population learning algorithm is used. The proposed approach reduces the original dataset in two dimensions: selection of reference instances and removal of irrelevant attributes. To validate the approach, a computational experiment has been carried out. Presentation and discussion of the experiment results conclude the paper.
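
For illustration only, the sketch below shows the general idea described above: a candidate solution is a pair of index sets (selected reference instances, selected attributes), and a simple population-based procedure searches over such pairs. This is not the authors' agent-based population learning (A-Team) implementation; the Iris data, the 3-NN fitness measure, and the instance-swap move are placeholder assumptions chosen to keep the example self-contained.

```python
# Toy illustration of two-dimensional data reduction: a candidate solution
# is a pair of index sets (kept instances, kept attributes), scored by the
# cross-validated accuracy of a classifier trained on the reduced data.
# Sketch only -- dataset, classifier and move operator are placeholders.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
n, m = X.shape

def random_solution():
    """Draw a random subset of instance indices and attribute indices."""
    inst = rng.choice(n, size=n // 4, replace=False)
    attr = rng.choice(m, size=max(1, m // 2), replace=False)
    return inst, attr

def fitness(solution):
    """Accuracy of 3-NN on the reduced dataset (higher is better)."""
    inst, attr = solution
    X_red, y_red = X[np.ix_(inst, attr)], y[inst]
    return cross_val_score(KNeighborsClassifier(n_neighbors=3),
                           X_red, y_red, cv=3).mean()

def improve(solution):
    """Local move: swap one kept instance for a currently discarded one."""
    inst, attr = solution
    outside = np.setdiff1d(np.arange(n), inst)
    new_inst = inst.copy()
    new_inst[rng.integers(len(inst))] = rng.choice(outside)
    return new_inst, attr

# Population-based search: maintain a pool of solutions and repeatedly try to
# improve randomly chosen members, accepting a move only if it does not hurt.
population = [random_solution() for _ in range(10)]
for _ in range(50):
    i = rng.integers(len(population))
    candidate = improve(population[i])
    if fitness(candidate) >= fitness(population[i]):
        population[i] = candidate

best = max(population, key=fitness)
print("kept %d of %d instances, attributes %s" % (len(best[0]), n, sorted(best[1])))
```

In the paper itself, the pool of candidate solutions is maintained and refined by cooperating software agents; the simple improvement loop above merely stands in for that machinery.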

Keywords

Feature Selection · Belief Revision · Tabu List · Reference Vector · Solution Manager
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Ireneusz Czarnowski (1)
  • Piotr Jȩdrzejowicz (1)
  1. Department of Information Systems, Gdynia Maritime University, Gdynia, Poland
