Selection of Informative Inputs Using Genetic Algorithms

  • Primož Potočnik
  • Igor Grabec
Conference paper


Modeling of processes with many input variables requires selection of informative inputs in order to construct less complex models with good generalization abilities. In this paper two feature selection methods are compared: mutual information (MI) based feature selection and genetic algorithm (GA) based feature selection. As a modeling structure a hybrid linear-neural model is used. The methods are applied to a case study: modeling of an industrial antibiotic fermentation process. It is shown that both feature selection methods can lead to similar results. GA based feature selection can be applied to problems where only a few data exist and MI cannot be calculated. In GA based feature selection it is possible to adjust the objective function in order to control the properties of the method.
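The sketch below illustrates the general idea of GA based feature selection described above: candidate input subsets are encoded as binary masks, each mask is scored by the validation error of a model trained on the selected inputs, and a penalty term in the objective function controls the trade-off between accuracy and the number of inputs. This is a minimal illustration on synthetic data using a plain linear least-squares model, not the authors' hybrid linear-neural model or the fermentation dataset; all names, parameter values, and the complexity penalty are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 samples, 10 candidate inputs, only 3 informative.
X = rng.normal(size=(200, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 7] + 0.1 * rng.normal(size=200)

# Train/validation split used to score each candidate feature subset.
X_tr, X_va, y_tr, y_va = X[:150], X[150:], y[:150], y[150:]

def rmse_of_subset(mask):
    """Fit a linear least-squares model on the selected inputs and
    return validation RMSE; empty subsets are heavily penalised."""
    if not mask.any():
        return np.inf
    A_tr = np.column_stack([X_tr[:, mask], np.ones(len(X_tr))])
    A_va = np.column_stack([X_va[:, mask], np.ones(len(X_va))])
    w, *_ = np.linalg.lstsq(A_tr, y_tr, rcond=None)
    return np.sqrt(np.mean((A_va @ w - y_va) ** 2))

def objective(mask, complexity_penalty=0.01):
    # Prediction error plus a penalty on the number of selected inputs;
    # adjusting this penalty is one way to control the properties of the search.
    return rmse_of_subset(mask) + complexity_penalty * mask.sum()

# Simple generational GA over binary feature masks.
pop_size, n_feat, n_gen = 30, X.shape[1], 40
pop = rng.random((pop_size, n_feat)) < 0.5

for _ in range(n_gen):
    scores = np.array([objective(ind) for ind in pop])
    parents = pop[np.argsort(scores)[: pop_size // 2]]   # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        p1, p2 = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_feat)                     # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        child ^= rng.random(n_feat) < 0.05                # bit-flip mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = min(pop, key=objective)
print("selected inputs:", np.flatnonzero(best),
      "validation RMSE:", rmse_of_subset(best))
```

Because the fitness of a subset is evaluated directly on held-out data, this kind of search needs no density estimate, which is why a GA based criterion remains applicable when the data set is too small for a reliable mutual information estimate.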


Genetic Algorithm, Feature Selection, Mutual Information, Root Mean Square, Feature Selection Method
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.





Copyright information

© Springer-Verlag Wien 1999

Authors and Affiliations

  • Primož Potočnik (1)
  • Igor Grabec (1)
  1. Faculty of Mechanical Engineering, University of Ljubljana, Ljubljana, Slovenia
