Evolved RBF Networks for Time-Series Forecasting and Function Approximation

  • V. M. Rivas
  • P. A. Castillo
  • J. J. Merelo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2439)

Abstract

An evolutionary algorithm with specific operators has been developed to automatically find Radial Basis Function Neural Networks that solve a given problem. The evolutionary algorithm optimizes all the parameters related to the neural network architecture, i.e., the number of hidden neurons and their configuration. A set of parameters to run the algorithm is found and tested against a set of time-series forecasting and function approximation problems. The results obtained are compared with those yielded by similar methods.
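To make the idea concrete, the sketch below illustrates the general scheme the abstract describes: evolving both the number of hidden neurons and their configuration, with the output weights fit separately. It is only an illustration, not the authors' algorithm or operators; it assumes Gaussian basis functions, individuals encoded as variable-length lists of (center, width) pairs, least-squares output weights, truncation selection, and a toy noisy-sine approximation task. All names and parameter values are hypothetical.

```python
# Minimal sketch (not the paper's implementation) of evolving RBF networks
# for function approximation. Assumptions: Gaussian basis functions,
# individuals = variable-length sets of (center, width) pairs, output
# weights fit by least squares, fitness = training mean squared error.
import numpy as np

rng = np.random.default_rng(0)

def design_matrix(x, centers, widths):
    # One Gaussian column per hidden neuron, plus a bias column.
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * widths[None, :] ** 2))
    return np.hstack([phi, np.ones((len(x), 1))])

def fitness(ind, x, y):
    # Output-layer weights are fit analytically; fitness is the training MSE.
    phi = design_matrix(x, ind["centers"], ind["widths"])
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return float(np.mean((phi @ w - y) ** 2))

def mutate(ind, x):
    # Structural and parametric operators: add/remove a neuron, perturb centers/widths.
    new = {"centers": ind["centers"].copy(), "widths": ind["widths"].copy()}
    op = rng.choice(["add", "remove", "perturb"])
    if op == "add":
        new["centers"] = np.append(new["centers"], rng.choice(x))
        new["widths"] = np.append(new["widths"], rng.uniform(0.1, 1.0))
    elif op == "remove" and len(new["centers"]) > 1:
        i = rng.integers(len(new["centers"]))
        new["centers"] = np.delete(new["centers"], i)
        new["widths"] = np.delete(new["widths"], i)
    else:
        new["centers"] = new["centers"] + rng.normal(0, 0.1, size=new["centers"].shape)
        new["widths"] = np.abs(new["widths"] + rng.normal(0, 0.05, size=new["widths"].shape))
    return new

# Toy data: approximate a noisy sine function.
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + rng.normal(0, 0.05, size=x.shape)

# Simple generational loop with truncation selection of the best half.
pop = [{"centers": rng.choice(x, size=3), "widths": rng.uniform(0.1, 1.0, size=3)}
       for _ in range(20)]
for gen in range(50):
    scored = sorted(pop, key=lambda ind: fitness(ind, x, y))
    parents = scored[:10]
    pop = parents + [mutate(parents[rng.integers(len(parents))], x) for _ in range(10)]

best = min(pop, key=lambda ind: fitness(ind, x, y))
print(f"hidden neurons: {len(best['centers'])}, MSE: {fitness(best, x, y):.4f}")
```

In a full system along the lines described in the paper, fitness would typically be measured on held-out data and the variation operators tailored to the target problem; this sketch only shows the overall evolve-structure-then-fit-weights loop.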

Keywords

RBF · evolutionary algorithms · EO · functional estimation · time series forecasting

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • V. M. Rivas (1)
  • P. A. Castillo (2)
  • J. J. Merelo (2)
  1. Dpto. Informática, Univ. de Jaén, Jaén, Spain
  2. Dpto. de Arquitectura y Tecnología de Computadores, Fac. de Ciencias, Univ. de Granada, Granada, Spain
