Feature Selection Based on Information Theory Filters

  • Włodzisław Duch
  • Jacek Biesiada
  • Tomasz Winiarski
  • Karol Grudziński
  • Krzysztof Grąbczewski
Part of the Advances in Soft Computing book series (AINSC, volume 19)

Abstract

Feature selection is an essential component of all data mining applications. Features were ranked by several computationally inexpensive methods based on information theory. The accuracy of neural, similarity-based, and decision tree classifiers was calculated with a reduced number of features and compared with that of computationally more expensive feature elimination methods.
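The ranking approach described above scores each feature independently by an information-theoretic measure such as mutual information with the class, then keeps the top-scoring features. The sketch below is an illustrative reconstruction of that idea for discrete features, not the authors' implementation; function names and the estimation-by-counting approach are assumptions.

```python
import math
from collections import Counter

def mutual_information(feature, labels):
    """Estimate I(X;C) = sum_{x,c} p(x,c) * log2( p(x,c) / (p(x) p(c)) )
    from empirical counts over a discrete feature and class labels."""
    n = len(feature)
    px = Counter(feature)            # marginal counts of feature values
    pc = Counter(labels)             # marginal counts of class labels
    pxc = Counter(zip(feature, labels))  # joint counts
    mi = 0.0
    for (x, c), count in pxc.items():
        p_joint = count / n
        # p_joint / (p(x) * p(c)) = count * n / (px[x] * pc[c])
        mi += p_joint * math.log2(count * n / (px[x] * pc[c]))
    return mi

def rank_features(X, y):
    """Rank feature indices by decreasing mutual information with the class.
    X is a list of samples (rows); zip(*X) iterates over feature columns."""
    scores = [(mutual_information(col, y), i)
              for i, col in enumerate(zip(*X))]
    return [i for score, i in sorted(scores, reverse=True)]
```

A feature identical to the class label attains the full class entropy (here 1 bit), while an independent feature scores near zero, so `rank_features` places the informative column first. Continuous features would first need discretization, which is one reason such filters are inexpensive compared with wrapper-style feature elimination.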

Keywords

Feature Selection, Mutual Information, Feature Selection Method, Feature Ranking, Decision Tree Classifier

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Włodzisław Duch (1)
  • Jacek Biesiada (2)
  • Tomasz Winiarski (1)
  • Karol Grudziński (1)
  • Krzysztof Grąbczewski (1)
  1. Department of Informatics, Nicholas Copernicus University, Toruń, Poland
  2. Department of Electrotechnology, Division of Computer Methods, The Silesian University of Technology, Katowice, Poland