Time Series Forecasting on Engineering Systems Using Recurrent Neural Networks

  • Conference paper
  • In: Advanced Data Mining and Applications (ADMA 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10086)

Abstract

Modern large-scale processing and manufacturing systems cover a wide array and large number of assets that need to work together to ensure that the plant generates output reliably and at the desired yield rate, reflected in indicators such as the viscosity of the quench oil in a styrene cracking system. However, due to the complexity of the overall process, it is important to consider the entire plant as a network in order to identify deterioration patterns and forecast its condition.

Instead of deriving the prediction from an engineering perspective, we propose to leverage a deep learning approach to predict the next state from historical information. In particular, the recurrent neural network (RNN) is selected in this paper as the basis for temporal forecasting. Considering that multiple sub-systems run in parallel and that their independence cannot be captured by a single ordinary RNN, we design an LSTM (Long Short-Term Memory) network for each sub-system and feed the outputs of the LSTMs into a linear neural network layer to predict viscosity one hour ahead.
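As a rough illustration of the architecture described above (a minimal sketch, not the authors' implementation), the snippet below assigns one LSTM per sub-system and fuses their final hidden states through a single linear layer to produce the one-hour-ahead viscosity forecast. The framework (PyTorch), the class name PerSubsystemLSTM, the hidden size, and the per-sub-system channel counts are assumptions made only for this example.

    # Minimal sketch (assumptions: PyTorch, hypothetical sub-system sizes);
    # not the authors' code, only an illustration of the described architecture.
    import torch
    import torch.nn as nn

    class PerSubsystemLSTM(nn.Module):
        """One LSTM per sub-system; a linear layer fuses their last hidden states."""

        def __init__(self, subsystem_dims, hidden_size=32):
            super().__init__()
            # One independent LSTM over each sub-system's sensor channels.
            self.lstms = nn.ModuleList(
                [nn.LSTM(input_size=d, hidden_size=hidden_size, batch_first=True)
                 for d in subsystem_dims]
            )
            # Linear layer maps the concatenated hidden states to one scalar forecast.
            self.head = nn.Linear(hidden_size * len(subsystem_dims), 1)

        def forward(self, inputs):
            # inputs: one tensor per sub-system, each shaped (batch, time, channels)
            last_states = []
            for lstm, x in zip(self.lstms, inputs):
                _, (h_n, _) = lstm(x)        # h_n: (num_layers, batch, hidden_size)
                last_states.append(h_n[-1])  # last layer's final hidden state
            return self.head(torch.cat(last_states, dim=1))  # (batch, 1)

    # Hypothetical usage: three sub-systems with 4, 6 and 3 sensor channels,
    # 24 past time steps per sample, batch of 8 samples.
    model = PerSubsystemLSTM([4, 6, 3])
    history = [torch.randn(8, 24, d) for d in (4, 6, 3)]
    viscosity_pred = model(history)   # one-hour-ahead forecast, shape (8, 1)

In this sketch each LSTM sees only its own sub-system's channels, so cross-system interactions enter only through the final linear layer; how sensors are actually partitioned into sub-systems is not specified here and would have to follow the plant's topology.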


Author information

Corresponding author

Correspondence to Tianyou Zhang.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Shao, D., Zhang, T., Mannar, K., Han, Y. (2016). Time Series Forecasting on Engineering Systems Using Recurrent Neural Networks. In: Li, J., Li, X., Wang, S., Li, J., Sheng, Q. (eds) Advanced Data Mining and Applications. ADMA 2016. Lecture Notes in Computer Science (LNAI), vol. 10086. Springer, Cham. https://doi.org/10.1007/978-3-319-49586-6_31

  • DOI: https://doi.org/10.1007/978-3-319-49586-6_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-49585-9

  • Online ISBN: 978-3-319-49586-6

  • eBook Packages: Computer Science, Computer Science (R0)
