Time Series Forecasting Using Restricted Boltzmann Machine

  • Takashi Kuremoto
  • Shinsuke Kimura
  • Kunikazu Kobayashi
  • Masanao Obayashi
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 304)

Abstract

In this study, we propose a method for time series prediction using the restricted Boltzmann machine (RBM), a kind of stochastic neural network. The idea stems from Hinton & Salakhutdinov’s multilayer “encoder” network, which achieved dimensionality reduction of data. A three-layer deep network of RBMs is constructed; after the RBMs are pre-trained using their energy functions, gradient-descent training (error back-propagation) is applied for fine-tuning. Additionally, to address the problem of determining the neural network structure, particle swarm optimization (PSO) is used to find a suitable number of units and suitable parameters. Moreover, a preprocessing step, “trend removal”, is applied to the original data before forecasting. To compare the proposed predictor with a conventional neural network method, the multi-layer perceptron (MLP), the CATS benchmark data was used in the prediction experiments.
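As an illustration of the pre-training stage described above, the following is a minimal sketch of a single Bernoulli RBM trained with one-step contrastive divergence (CD-1) in Python/NumPy. The class and variable names are our own, and the sketch deliberately omits the three-layer stacking, the back-propagation fine-tuning, the PSO structure search, and the trend-removal preprocessing of the paper's full method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible-unit biases
        self.b_h = np.zeros(n_hidden)    # hidden-unit biases
        self.lr = lr

    def sample_h(self, v):
        """Hidden activation probabilities and a binary sample given visibles."""
        p = sigmoid(v @ self.W + self.b_h)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        """Visible activation probabilities and a binary sample given hiddens."""
        p = sigmoid(h @ self.W.T + self.b_v)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0):
        """One CD-1 update on a batch of binary visible vectors v0."""
        ph0, h0 = self.sample_h(v0)       # positive phase
        pv1, _ = self.sample_v(h0)        # one Gibbs step back to visibles
        ph1, _ = self.sample_h(pv1)       # negative phase
        n = len(v0)
        # Approximate log-likelihood gradient: <v h>_data - <v h>_recon
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b_v += self.lr * (v0 - pv1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - pv1) ** 2)   # reconstruction error for monitoring

# Toy usage: learn two alternating binary patterns
data = np.tile([[1, 0, 1, 0, 1, 0], [0, 1, 0, 1, 0, 1]], (50, 1)).astype(float)
rbm = RBM(n_visible=6, n_hidden=3)
for _ in range(200):
    err = rbm.cd1_step(data)
```

In the paper's setting, the learned hidden representations of stacked RBMs would then be fine-tuned end-to-end by back-propagation on the forecasting target.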

Keywords

time series forecasting · restricted Boltzmann machine · multilayer perceptron · CATS benchmark

References

  1. Crone, S., Nikolopoulos, K.: Results of the NN3 neural network forecasting competition. In: The 27th International Symposium on Forecasting, Program, vol. 129 (2007)
  2. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning Representations by Back-Propagating Errors. Nature 323, 533–536 (1986)
  3. Hinton, G.E., Salakhutdinov, R.R.: Reducing the Dimensionality of Data with Neural Networks. Science 313, 504–507 (2006)
  4. Roux, N.L., Bengio, Y.: Representational Power of Restricted Boltzmann Machines and Deep Belief Networks. Neural Computation 20, 1631–1649 (2008)
  5. Hinton, G.E., Sejnowski, T.J.: Learning and Relearning in Boltzmann Machines. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1: Foundations. MIT Press, Cambridge (1986)
  6. Ackley, D.H., Hinton, G.E., Sejnowski, T.J.: A Learning Algorithm for Boltzmann Machines. Cognitive Science 9(1), 147–169 (1985)
  7. Kennedy, J., Eberhart, R.C.: Particle Swarm Optimization. In: IEEE International Conference on Neural Networks, pp. 1942–1948 (1995)
  8. Lendasse, A., Oja, E., Simula, O., Verleysen, M.: Time Series Prediction Competition: The CATS Benchmark. In: International Joint Conference on Neural Networks, pp. 1615–1620 (2004)
  9. Lendasse, A., Oja, E., Simula, O., Verleysen, M.: Time Series Prediction Competition: The CATS Benchmark. Neurocomputing 70, 2325–2329 (2007)
  10. Box, G.E.P., Jenkins, G.: Time Series Analysis, Forecasting and Control. Cambridge University Press, Cambridge (1976)
  11. Zhang, G.P.: Time Series Forecasting Using a Hybrid ARIMA and Neural Network Model. Neurocomputing 50, 159–175 (2003)
  12. Gardner, E., McKenzie, E.: Seasonal Exponential Smoothing with Damped Trends. Management Science 35(3), 372–376 (1989)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Takashi Kuremoto (1)
  • Shinsuke Kimura (1)
  • Kunikazu Kobayashi (2)
  • Masanao Obayashi (1)
  1. Graduate School of Science and Engineering, Yamaguchi University, Ube, Japan
  2. School of Information Science & Technology, Aichi Prefectural University, Nagakute, Japan