Evolutionary Algorithm vs. Other Methods for Constructive Optimisation of RBF Network Kernels

  • Jan Koszlaga
  • Paweł Strumiłło
Conference paper
Part of the Advances in Soft Computing book series (AINSC, volume 19)


Three methods for optimising the receptive fields of Radial Basis Function (RBF) neural networks are compared in the paper, namely: gradient descent, simulated annealing, and an evolutionary algorithm. An incremental RBF network training scheme is considered, i.e., one in which RBF kernels are added one at a time and individually optimised. Algorithmic implementations of the tested optimisation methods for configuring the RBF receptive fields are shown and their computational costs are compared. The considered optimisation methods yield excellent results on the Iris flower classification benchmark. For the genetic optimisation scheme, three Gaussian RBF kernels are sufficient to achieve an average classification accuracy of 98% on the Irises.
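The incremental, kernel-by-kernel optimisation described above can be sketched as follows. This is an illustrative toy in Python, not the authors' implementation: it assumes a single Gaussian kernel whose centre, width, and output weight are tuned by a simple elitist evolutionary search on one-dimensional data; all function names and parameter values are illustrative choices.

```python
import math
import random

def gaussian_kernel(x, c, sigma):
    """Gaussian RBF response for scalar input x, centre c, width sigma."""
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def network_error(params, data):
    """Sum of squared errors of a single-kernel RBF model on (x, y) pairs."""
    c, sigma, w = params
    return sum((y - w * gaussian_kernel(x, c, sigma)) ** 2 for x, y in data)

def evolve_kernel(data, generations=200, pop_size=20, seed=0):
    """Elitist (mu + lambda)-style search over (centre, width, weight)."""
    rng = random.Random(seed)
    # Random initial population of kernel parameter triples.
    pop = [(rng.uniform(-2, 2), rng.uniform(0.1, 2.0), rng.uniform(-1, 1))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: network_error(p, data))
        parents = pop[:pop_size // 2]          # keep the fitter half
        children = [(c + rng.gauss(0, 0.1),    # Gaussian mutation of centre,
                     max(0.05, s + rng.gauss(0, 0.1)),  # width (kept positive),
                     w + rng.gauss(0, 0.1))    # and output weight
                    for c, s, w in parents]
        pop = parents + children
    return min(pop, key=lambda p: network_error(p, data))

# Toy target: samples of a Gaussian bump centred at 0.5.
data = [(x / 10.0, math.exp(-((x / 10.0 - 0.5) ** 2) / 0.08))
        for x in range(-20, 21)]
best = evolve_kernel(data)
print(network_error(best, data))  # residual error of the fitted kernel
```

In the incremental scheme considered in the paper, once one kernel is optimised and fixed, the next kernel would be fitted to the residual network error, and the process repeated until the error target is met.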


Keywords: Radial Basis Function, Radial Basis Function Neural Network, Radial Basis Function Kernel, Network Error, Average Classification Accuracy





Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Jan Koszlaga¹
  • Paweł Strumiłło¹
  1. Institute of Electronics, Technical University of Łódź, Łódź, Poland
