Bayesian Inference for Basis Function Selection in Nonlinear System Identification using Genetic Algorithms

  • Visakan Kadirkamanathan
Conference paper
Part of the Fundamental Theories of Physics book series (FTPH, volume 70)


In this paper, an algorithm based on Bayesian model comparison is developed to determine the most probable model amongst a large number of candidate models formed from a wide class of basis functions. The models consist of linear coefficients and nonlinear basis functions, which may themselves be parametrised; different models are constructed from different subsets of basis functions. With a suitable encoding, genetic algorithms are used to search the space of all possible subsets of basis functions and determine the most probable model describing the given observations.
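The scheme described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: each chromosome is a bit mask over a pool of candidate Gaussian radial basis functions, fitness is a MacKay-style log evidence for the resulting linear-in-parameters model (with assumed fixed prior precision `alpha` and noise precision `beta`), and a simple GA with tournament selection, one-point crossover, and bit-flip mutation searches over subsets. All function names, hyperparameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_design(x, centers, width=0.5):
    # Design matrix of Gaussian radial basis functions at inputs x
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def log_evidence(Phi, y, alpha=1e-2, beta=25.0):
    # Log marginal likelihood (evidence) of y = Phi w + noise,
    # with Gaussian prior w ~ N(0, alpha^-1 I) and noise precision beta.
    # alpha and beta are assumed fixed here for simplicity.
    N, M = Phi.shape
    if M == 0:
        return -np.inf  # empty model: no basis functions selected
    A = alpha * np.eye(M) + beta * Phi.T @ Phi       # posterior precision
    m = beta * np.linalg.solve(A, Phi.T @ y)         # posterior mean weights
    E = 0.5 * beta * np.sum((y - Phi @ m) ** 2) + 0.5 * alpha * m @ m
    _, logdetA = np.linalg.slogdet(A)
    return (0.5 * M * np.log(alpha) + 0.5 * N * np.log(beta)
            - E - 0.5 * logdetA - 0.5 * N * np.log(2 * np.pi))

def ga_select(x, y, centers, pop=30, gens=40, p_mut=0.05):
    # GA over binary masks: bit k = 1 means candidate basis k is in the model.
    Phi_all = rbf_design(x, centers)
    K = len(centers)
    popn = rng.integers(0, 2, size=(pop, K))
    fitness = lambda c: log_evidence(Phi_all[:, c.astype(bool)], y)
    for _ in range(gens):
        fit = np.array([fitness(c) for c in popn])
        # Tournament selection: keep the fitter of two random chromosomes
        new = [popn[i if fit[i] > fit[j] else j].copy()
               for i, j in rng.integers(0, pop, size=(pop, 2))]
        popn = np.array(new)
        # One-point crossover on consecutive pairs
        for i in range(0, pop - 1, 2):
            cut = rng.integers(1, K)
            tmp = popn[i, cut:].copy()
            popn[i, cut:] = popn[i + 1, cut:]
            popn[i + 1, cut:] = tmp
        # Bit-flip mutation
        flips = rng.random(popn.shape) < p_mut
        popn = np.where(flips, 1 - popn, popn)
    fit = np.array([fitness(c) for c in popn])
    best = popn[np.argmax(fit)]
    return best, fit.max()

# Synthetic identification task: noisy data generated by two RBFs,
# selected from a pool of 12 candidate centers.
x = np.linspace(-3, 3, 100)
centers = np.linspace(-3, 3, 12)
y = (rbf_design(x, np.array([-1.5, 0.5])) @ np.array([1.0, -2.0])
     + 0.05 * rng.normal(size=x.size))
mask, ev = ga_select(x, y, centers)
```

In a fuller treatment the evidence would also integrate over (or optimise) `alpha`, `beta`, and any nonlinear basis parameters such as the RBF widths; here they are held fixed so that the subset search itself is the only moving part.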


Keywords: Genetic Algorithm · Basis Function · Root Mean Square Error · Radial Basis Function · Radial Basis Function Neural Network




Copyright information

© Kluwer Academic Publishers 1996

Authors and Affiliations

  • Visakan Kadirkamanathan
  1. Department of Automatic Control & Systems Engineering, University of Sheffield, UK
