Abstract
A deep belief net (DBN) composed of stacked restricted Boltzmann machines (RBMs) was proposed by Hinton and Salakhutdinov in 2006 for reducing the dimensionality of data. Compared with conventional methods such as principal component analysis (PCA), the superior performance of the DBN attracted wide attention among pattern recognition researchers, and it ushered in a new era of artificial intelligence (AI) under the keyword “deep learning” (DL). Deep neural networks (DNNs) such as DBNs, deep auto-encoders (DAEs), and convolutional neural networks (CNNs) have been successfully applied to dimensionality reduction, image processing, pattern recognition, etc.; nevertheless, there are further AI disciplines to which they could be applied, such as computational cognition, behavior decision, and forecasting. Furthermore, the architectures of conventional deep models are usually handcrafted, i.e., the optimization of the structure of a DNN remains an open problem. In this chapter, we introduce how DBNs were first adopted for time series forecasting systems in our original studies, and we discuss two heuristic methods for optimizing the structure of a DBN: particle swarm optimization (PSO), a well-known method in swarm intelligence, and random search (RS), a simpler yet effective algorithm for high-dimensional hyper-parameter exploration.
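To illustrate the simpler of the two heuristics, the following is a minimal sketch of random search over a DBN's structural hyper-parameters. The search space (two hidden-layer sizes and a learning rate) and the surrogate objective are illustrative assumptions, not the chapter's exact ranges; a real implementation would replace `validation_error` with RBM pre-training plus back-propagation fine-tuning and measure forecast error on held-out data.

```python
import random

# Hypothetical search space for a DBN forecaster (illustrative ranges).
SPACE = {
    "n_hidden1": lambda: random.randint(2, 20),       # units in 1st RBM layer
    "n_hidden2": lambda: random.randint(2, 20),       # units in 2nd RBM layer
    "learning_rate": lambda: 10 ** random.uniform(-3, -1),  # log-uniform draw
}

def sample_config():
    """Draw one candidate structure independently from the search space."""
    return {name: draw() for name, draw in SPACE.items()}

def validation_error(config):
    """Stand-in for training the DBN and measuring forecast error (e.g., MSE).
    Here a toy surrogate penalizes distance from an arbitrary 'good' setting."""
    return ((config["n_hidden1"] - 10) ** 2
            + (config["n_hidden2"] - 5) ** 2
            + (config["learning_rate"] - 0.05) ** 2)

def random_search(n_trials=100, seed=0):
    """Pure random search (Bergstra & Bengio, 2012): sample configurations
    independently and keep the best one seen so far."""
    random.seed(seed)
    best_cfg, best_err = None, float("inf")
    for _ in range(n_trials):
        cfg = sample_config()
        err = validation_error(cfg)
        if err < best_err:
            best_cfg, best_err = cfg, err
    return best_cfg, best_err

best_cfg, best_err = random_search()
```

Because each trial is independent, random search parallelizes trivially and, unlike grid search, does not waste trials on hyper-parameters that turn out not to matter.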
References
Mnih, V., et al.: Human-level control through deep reinforcement learning. Nature 518, 529–533 (2015)
Silver, D., et al.: Mastering the game of Go with deep neural networks and tree search. Nature 529, 484–489 (2016)
Silver, D., et al.: Mastering the game of Go without human knowledge. Nature 550, 354–359 (2017)
Williams, R.J.: Simple statistical gradient-following algorithms for connectionist reinforcement learning. Mach. Learn. 8, 229–256 (1992)
Hinton, G.E., Salakhutdinov, R.R.: Reducing the dimensionality of data with neural networks. Science 313, 504–507 (2006)
Hinton, G.E., Osindero, S., Teh, Y.W.: A fast learning algorithm for deep belief nets. Neural Comput. 18(7), 1527–1554 (2006)
Roux, N.L., Bengio, Y.: Representational power of restricted Boltzmann machines and deep belief networks. Neural Comput. 20(6), 1631–1649 (2008)
Krizhevsky, A., Sutskever, I., Hinton, G.E.: ImageNet classification with deep convolutional neural networks. In: Proceedings of the 25th International Conference on Neural Information Processing Systems (NIPS’12), vol.1, pp. 1097–1105 (2012)
ImageNet Large Scale Visual Recognition Challenge (2012). http://www.image-net.org/challenges/LSVRC/2012/
Bergstra, J., Bengio, Y.: Random search for hyper-parameter optimization. J. Mach. Learn. Res. 13, 281–305 (2012)
Ba, J., Frey, B.: Adaptive dropout for training deep neural networks. In: Advances in Neural Information Processing Systems (NIPS2013), vol. 26, pp. 3084–3092 (2013)
Nowakowski, G., Dorogyy, Y., Doroga-Ivaniuk, O.: Neural network structure optimization algorithm. J. Autom. Mob. Robot. Intell. Syst. 12(1), 5–13 (2018)
Kuremoto, T., Kimura, S., Kobayashi, K., Obayashi, M.: Time series forecasting using restricted Boltzmann machine. In: Proceedings of the 8th International Conference on Intelligent Computing (ICIC 2012). Communications in Computer and Information Science (CCIS), vol. 304, pp. 17–22. Springer, Berlin (2012)
Kuremoto, T., Kimura, S., Kobayashi, K., Obayashi, M.: Time series forecasting using a deep belief network with restricted Boltzmann machines. Neurocomputing 137(5), 47–56 (2014)
Kuremoto, T., Hirata, T., Obayashi, M., Mabu, S., Kobayashi, K.: Forecast chaotic time series data by DBNs. In: Proceedings of the 7th International Congress on Image and Signal Processing (CISP 2014), pp. 1304–1309 (2014)
Kennedy, J., Eberhart, R.C.: Particle swarm optimization. In: IEEE International Conference on Neural Networks, pp. 1942–1948 (1995)
Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S.: Time series prediction using DBN and ARIMA. In: Proceedings of International Conference on Computer Application Technologies (CCATS 2015), pp. 24–29 (2015)
Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S., Kobayashi, K.: Deep belief network using reinforcement learning and its applications to time series forecasting. In: Proceedings of International Conference on Neural Information Processing (ICONIP 2016). Lecture Notes in Computer Science (LNCS), vol. 9949, pp. 30–37. Springer, Berlin (2016)
Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S., Kobayashi, K.: Forecasting real time series data using deep belief net and reinforcement learning. J. Robot. Netw. Artif. Life 4(4), 260–264 (2018). https://doi.org/10.2991/jrnal.2018.4.4.1
Simonyan, K., Zisserman, A.: Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556 (2014)
He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 770–778 (2016)
Wang, B., Sun, Y., Xue, B., Zhang, M.: Evolving deep convolutional neural networks by variable-length particle swarm optimization for image classification. In: 2018 IEEE Congress on Evolutionary Computation (CEC), pp. 1514–1521 (2018)
Fernandes Jr., F.E., Yen, G.G.: Particle swarm optimization of deep neural networks architectures for image classification. Swarm Evol. Comput. 49, 62–73 (2019)
Lorenz, E.N.: Deterministic nonperiodic flow. J. Atmos. Sci. 20, 130–141 (1963)
Hénon, M.: A two-dimensional mapping with a strange attractor. Commun. Math. Phys. 50(1), 69–77 (1976)
Lendasse, A., Oja, E., Simula, O., Verleysen, M.: Time series prediction competition: the CATS benchmark. In: Proceedings of International Joint Conference on Neural Networks (IJCNN’04), pp. 1615–1620 (2004)
Lendasse, A., Oja, E., Simula, O., Verleysen, M.: Time series prediction competition: the CATS benchmark. Neurocomputing 70, 2325–2329 (2007)
Tieleman, T.: Training restricted Boltzmann machines using approximations to the likelihood gradient. In: Proceedings of the 25th International Conference on Machine Learning (ICML ’08), pp. 1064–1071 (2008)
Kuremoto, T., Tokuda, S., Obayashi, M., Mabu, S., Kobayashi, K.: An experimental comparison of deep belief nets with different learning methods. In: Proceedings of 2017 RISP International Workshop on Nonlinear Circuits, Communications and Signal Processing (NCSP 2017), pp. 637–640 (2017)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323, 533–536 (1986)
NN3: http://www.neural-forecasting-competition.com/NN3/index.htm
Kuremoto, T., Obayashi, M., Kobayashi, M.: Neural forecasting systems. In: Weber, C., Elshaw, M., Mayer, N.M. (eds.) Reinforcement Learning, Theory and Applications, Chapter 1, pp. 1–20 (2008). InTech
Box, G.E.P., Pierce, D.A.: Distribution of residual autocorrelations in autoregressive-integrated moving average time series models. J. Am. Stat. Assoc. 65(332), 1509–1526 (1970)
Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S., Kobayashi, K.: A novel approach to time series forecasting using deep learning and linear model. IEEJ Trans. Electron. Inf. Syst. 136(3), 248–356 (2016, in Japanese)
Kimura, H., Kobayashi, S.: Reinforcement learning for continuous action using stochastic gradient ascent. In: Proceedings of 5th Intelligent Autonomous Systems (IAS-5), pp. 288–295 (1998)
Kuremoto, T., Hirata, T., Obayashi, M., Mabu, S., Kobayashi, K.: Training deep neural networks with reinforcement learning for time series forecasting. In: Time Series Analysis - Data, Methods, and Applications, InTechOpen (2019)
Aalto University Applications of Machine Learning Group Datasets. http://research.ics.aalto.fi/eiml/datasets.shtml (accessed 01-01-17)
Hyndman, R.J.: Time Series Data Library (TSDL) (2013). http://robjhyndman.com/TSDL/ (accessed 01-01-13)
© 2020 Springer Nature Singapore Pte Ltd.
Cite this chapter
Kuremoto, T., Hirata, T., Obayashi, M., Kobayashi, K., Mabu, S. (2020). Search Heuristics for the Optimization of DBN for Time Series Forecasting. In: Iba, H., Noman, N. (eds) Deep Neural Evolution. Natural Computing Series. Springer, Singapore. https://doi.org/10.1007/978-981-15-3685-4_5
Print ISBN: 978-981-15-3684-7
Online ISBN: 978-981-15-3685-4