Industrial Time Series Prediction

  • Jun Zhao
  • Wei Wang
  • Chunyang Sheng
Part of the Information Fusion and Data Science book series (IFDS)


Time series prediction is an important means of forecasting the variables involved in an industrial process; it typically identifies the latent rules hidden in the time series data of those variables by means of auto-regression. This chapter first introduces the phase space reconstruction technique, which constructs the training dataset for modeling. A series of data-driven machine learning methods for time series prediction are then provided. Several well-known artificial neural network (ANN) models are introduced, and a dual estimation-based echo state network (ESN) model is proposed that simultaneously estimates the uncertainties of the output weights and the internal states for noisy industrial time series, using a nonlinear Kalman filter together with a linear one. Kernel-based methods, including the Gaussian process (GP) model and the support vector machine (SVM) model, are also presented. In particular, an improved GP-based ESN model is proposed for time series prediction, in which modeling the ESN output weights with a GP avoids the ill-conditioning associated with the generic ESN formulation. A number of case studies on industrial energy systems are provided to validate the performance of these methods.
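As a concrete illustration of the phase space reconstruction step described above, the following is a minimal sketch (assuming NumPy; the function name, the toy sine-wave series, and the choice of embedding dimension and delay are illustrative, not taken from the chapter) of time-delay embedding used to build an auto-regressive training dataset:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series x with embedding
    dimension `dim` and delay `tau`. Each row of X is one
    reconstructed phase-space point; y is the one-step-ahead target."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau - 1          # number of usable samples
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    y = x[(dim - 1) * tau + 1 : (dim - 1) * tau + 1 + n]
    return X, y

# Usage: embed a noisy sine wave and fit a simple linear auto-regressor.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.01 * np.random.randn(len(t))
X, y = delay_embed(series, dim=4, tau=5)
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
```

The resulting pairs (X, y) can feed any of the predictors discussed in the chapter (ANN, ESN, GP, or SVM); in practice the embedding dimension and delay would be chosen by criteria such as the false nearest neighbors and mutual information methods rather than fixed by hand.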


Keywords: Time series · Auto-regression · Phase space reconstruction · Embedding dimensionality · Linear regression · Probabilistic · Kernel · Echo state network · Gaussian process · Support vector machine · LSSVM · Dual estimation · Sample selection · Marginal distribution · Bayesian · Posterior distributions · Industrial energy system



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Jun Zhao (1)
  • Wei Wang (1)
  • Chunyang Sheng (2)
  1. Dalian University of Technology, Dalian, China
  2. Shandong University of Science and Technology, Qingdao, China
