
A Genetic Designed Beta Basis Function Neural Network for Approximating Multi-Variables Functions

  • Chaouki Aouiti
  • Adel M. Alimi
  • Aref Maalej

Abstract

In this paper we propose a new genetic algorithm for Beta basis function neural networks (BBFNN). The main properties of this genetic algorithm are the representation it uses and its ability to obtain the optimal structure of a BBFNN for approximating a multi-variable function.
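For context, the one-dimensional Beta function that serves as the basis function of a BBFNN is usually written as follows in the authors' related work on Beta fuzzy basis functions; the notation below is a reconstruction and is not stated in this abstract:

\beta(x; x_0, x_1, p, q) =
\begin{cases}
\left(\dfrac{x - x_0}{x_c - x_0}\right)^{p}\left(\dfrac{x_1 - x}{x_1 - x_c}\right)^{q}, & x_0 < x < x_1,\\[1ex]
0, & \text{otherwise,}
\end{cases}
\qquad x_c = \dfrac{p\,x_1 + q\,x_0}{p + q}, \quad p, q > 0.

The parameters x_0 and x_1 bound the support of the function, while p and q control its shape and asymmetry.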

Each network is coded as a matrix whose number of rows equals the number of parameters of the function. The genetic algorithm operators change the number of neurons in the hidden layer. Applications to functions of one and two variables are considered to demonstrate the performance of the BBFNN and of its genetic-algorithm-based design; a rough sketch of such an encoding follows.
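As an illustration of the encoding described above, the Python/NumPy sketch below codes a single-input BBFNN as a matrix whose columns are hidden neurons and whose rows hold an assumed parameter layout (x0, x1, p, q, plus an output weight w), together with structural operators that add or exchange hidden neurons. All names and the exact parameter layout are illustrative assumptions, not taken from the paper.

import numpy as np

N_PARAMS = 5  # rows per column: x0, x1, p, q, w (assumed layout, not from the paper)

def beta(x, x0, x1, p, q):
    # One-dimensional Beta basis function; zero outside the open interval ]x0, x1[.
    xc = (p * x1 + q * x0) / (p + q)  # location of the peak
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    inside = (x > x0) & (x < x1)
    xi = x[inside]
    out[inside] = ((xi - x0) / (xc - x0)) ** p * ((x1 - xi) / (x1 - xc)) ** q
    return out

def bbfnn_output(chromosome, x):
    # Network output: weighted sum of Beta basis functions, one per column (hidden neuron).
    y = np.zeros_like(np.asarray(x, dtype=float))
    for x0, x1, p, q, w in chromosome.T:
        y += w * beta(x, x0, x1, p, q)
    return y

def add_neuron_mutation(chromosome, rng):
    # Structural mutation: append one randomly initialised hidden neuron (column).
    new_col = np.array([[rng.uniform(-1.0, 0.0)],   # x0
                        [rng.uniform(0.0, 1.0)],    # x1
                        [rng.uniform(0.5, 3.0)],    # p
                        [rng.uniform(0.5, 3.0)],    # q
                        [rng.normal()]])            # output weight w
    return np.hstack([chromosome, new_col])

def crossover(parent_a, parent_b, rng):
    # Exchange hidden neurons (columns); the child may have a different
    # hidden-layer size than either parent.
    ka = rng.integers(1, parent_a.shape[1] + 1)
    kb = rng.integers(0, parent_b.shape[1])
    return np.hstack([parent_a[:, :ka], parent_b[:, kb:]])

# Usage: grow a 3-neuron network and evaluate it on a few points.
rng = np.random.default_rng(0)
net = np.empty((N_PARAMS, 0))
for _ in range(3):
    net = add_neuron_mutation(net, rng)
print(bbfnn_output(net, np.linspace(-1.0, 1.0, 5)))

Under a column-wise encoding like this, crossover between parents with different numbers of columns naturally produces offspring with a different hidden-layer size, which matches the structural role the abstract assigns to the genetic operators.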

Keywords

Genetic Algorithm; Hidden Layer; Mutation Operator; Crossover Operator; Beta Function

Copyright information

© Springer-Verlag Wien 2001

Authors and Affiliations

  • Chaouki Aouiti (1)
  • Adel M. Alimi (2)
  • Aref Maalej (1)
  1. Department of Mechanical Engineering, LASEM: Laboratory of Electromechanical Systems, University of Sfax, ENIS, Sfax, Tunisia
  2. REGIM: Research Group on Intelligent Machines, Department of Electrical Engineering, University of Sfax, ENIS, Sfax, Tunisia
