Time-Series Prediction with Neural Networks: Combinatorial versus Sequential Approach

  • A. Dobnikar
  • M. Trebar
  • B. Petelin


In this paper, two different approaches to time-series prediction with neural networks are presented. The first is called combinatorial because it deals with a finite set of classes, obtained from the differences between several consecutive function values; it is implemented with a modular neural network. The second describes a time-series with interval functions, i.e. sequences of successive function values, and is therefore a sequential approach, employing Kalman neural gas networks. In the first case, the future value (prediction) of an input vector depends on the classes (from input vectors, possibly together with the next values) obtained by learning the history of the time-series. In the second, based on the last input vector(s), the closest covering neuron (interval function) is determined and is responsible for computing the future value. A linear autoregressive method (AR) and a multilayer perceptron (MLP) are used as references, and the efficiency of the suggested methods is evaluated on three different time-series.
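The sequential approach described above, predicting the next value via the stored prototype closest to the most recent input window, can be illustrated with a minimal sketch. This is not the authors' code: a real Kalman neural gas network quantizes the training windows into a small, adaptively placed codebook, whereas here, for brevity, every training window acts as its own prototype.

```python
# Illustrative sketch (hypothetical helper names, not the paper's implementation):
# nearest-prototype prediction in the spirit of the sequential approach.
# Each prototype is a window of past values paired with the value that
# followed it; a new window is predicted by its closest stored prototype.

def window_distance(a, b):
    """Squared Euclidean distance between two equal-length windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def build_prototypes(series, window):
    """Collect (window, next_value) pairs from the training series.
    A neural gas network would condense these into a few prototype
    neurons; here every training window is kept verbatim."""
    return [(series[i:i + window], series[i + window])
            for i in range(len(series) - window)]

def predict_next(prototypes, recent):
    """Predict the next value from the prototype closest to `recent`
    (the 'closest covering neuron' of the sequential approach)."""
    best = min(prototypes, key=lambda p: window_distance(p[0], recent))
    return best[1]

# Toy usage: a periodic series is predicted from its repeating windows.
train = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
protos = build_prototypes(train, window=3)
print(predict_next(protos, [1, 2, 3]))  # prints 0, the value that followed [1, 2, 3]
```

The combinatorial approach differs mainly in what is stored: instead of raw windows, it classifies the differences between consecutive values into a finite set of classes and lets a modular network map input vectors to those classes.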


Keywords (machine-generated, not supplied by the authors): Neural Network, Input Vector, Interval Function, Recurrent Neural Network, Sequential Approach





Copyright information

© Springer-Verlag Wien 1998

Authors and Affiliations

  • A. Dobnikar (1)
  • M. Trebar (1)
  • B. Petelin (2)

  1. Faculty of Computer and Information Science, University of Ljubljana, Ljubljana, Slovenia
  2. Tomos — Informatika, Koper, Slovenia
