Automatic Feature Selection by Genetic Algorithms

  • Michael Eberhardt
  • Friedrich W. H. Kossebau
  • Andreas König
Conference paper


The efficient and automatic selection of features from an initial raw data set is an optimization task met in numerous application fields, e.g., multivariate data classification, analysis, and visualization. Reducing the number of variables mitigates the detrimental effects of the well-known curse of dimensionality. However, finding the optimal feature subset by exhaustive search is infeasible, as the underlying optimization problem is NP-complete. Thus, search heuristics are commonly applied to find acceptable solutions with feasible computational effort. In this work, genetic algorithms are applied, based on dedicated nonparametric cost functions and multiobjective optimization. The method was implemented in our general QuickCog environment. Competitive results were achieved for practical applications.
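The approach outlined above, evolving bitstrings that mark selected features under a cost trading subset quality against subset size, can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the toy fitness function (which simply rewards a fixed set of "informative" feature indices), the population size, and the operator rates are all assumptions standing in for the paper's nonparametric, multiobjective cost functions.

```python
import random

random.seed(0)

N_FEATURES = 20
# Toy assumption: features 0-4 carry the class information. A real system
# would score a subset with a nonparametric cost function (e.g., a
# classifier's leave-one-out accuracy) instead of this lookup.
INFORMATIVE = set(range(5))

def fitness(mask):
    """Reward informative features, penalise subset size (a scalarised
    stand-in for the paper's multiobjective optimization)."""
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in INFORMATIVE)
    size = sum(mask)
    return hits - 0.1 * size

def crossover(a, b):
    """One-point crossover of two genetic strings."""
    cut = random.randrange(1, N_FEATURES)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.02):
    """Flip each bit with a small probability."""
    return [1 - bit if random.random() < rate else bit for bit in mask]

def select_features(pop_size=40, generations=60):
    """Elitist truncation-selection GA over feature-mask bitstrings."""
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = select_features()
```

Because the selection step is elitist, the best fitness in the population never decreases, so the search reliably converges toward masks that keep the informative features while pruning the rest. A multiobjective variant, as used in the paper, would instead maintain a Pareto front over (quality, subset size) rather than scalarising the two objectives.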


Keywords: Pareto optimal front · initial array · automatic feature selection · underlying optimization problem · genetic string
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.




Copyright information

© Springer-Verlag Wien 2001

Authors and Affiliations

  • Michael Eberhardt
  • Friedrich W. H. Kossebau
  • Andreas König
  1. Dresden University of Technology, Dresden, Germany
