Evolving Recurrent Neural Networks for Time Series Data Prediction of Coal Plant Parameters

  • Conference paper
  • In: Applications of Evolutionary Computation (EvoApplications 2019)

Abstract

This paper presents the Evolutionary eXploration of Augmenting LSTM Topologies (EXALT) algorithm and its use in evolving recurrent neural networks (RNNs) for time series data prediction. It introduces a new open data set from a coal-fired power plant, consisting of 10 days of per-minute sensor recordings from 12 different burners at the plant. This large-scale, real-world data set involves complex dependencies between sensor parameters, making it challenging to predict. EXALT introduces new techniques for evolving neural networks, including epigenetic weight initialization, in which child neural networks reuse parental weights as a starting point for backpropagation, and node-level mutation operations that can improve evolutionary progress. EXALT was also designed with parallel computation in mind to further improve performance. Preliminary results were gathered predicting the Main Flame Intensity parameter: EXALT strongly outperformed five traditional neural network architectures on the best, average, and worst cases across 10 repeated training runs per test case, and was only slightly behind the best trained Elman recurrent neural networks while being significantly more reliable (i.e., much better average and worst-case results). Further, EXALT achieved these results 2 to 10 times faster than the traditional methods, in part due to its scalability, showing strong potential to beat the traditional architectures given additional runtime.
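The epigenetic weight initialization described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's implementation, which evolves richer LSTM topologies): a child network produced by a node-level "add node" mutation copies the overlapping region of each parental weight matrix, so only the newly added rows and columns start from fresh random values before backpropagation resumes. The `make_rnn` and `epigenetic_init` names and the simple Elman-style parameter set are assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_rnn(hidden_size, input_size=1):
    """A minimal Elman-style RNN parameter set (hypothetical stand-in
    for the evolved networks)."""
    return {
        "W_in": rng.normal(0.0, 0.1, (hidden_size, input_size)),
        "W_rec": rng.normal(0.0, 0.1, (hidden_size, hidden_size)),
        "W_out": rng.normal(0.0, 0.1, (1, hidden_size)),
    }

def epigenetic_init(parent, child_hidden_size):
    """Child reuses parental weights wherever the shapes overlap; only
    the entries introduced by the mutation keep fresh random values."""
    child = make_rnn(child_hidden_size)
    for key, c in child.items():
        p = parent[key]
        rows = min(p.shape[0], c.shape[0])
        cols = min(p.shape[1], c.shape[1])
        c[:rows, :cols] = p[:rows, :cols]  # inherit the parental block
    return child

# Node-level "add node" mutation: grow the hidden layer from 4 to 5 units.
parent = make_rnn(hidden_size=4)
child = epigenetic_init(parent, child_hidden_size=5)
```

Because the inherited block already encodes a partially trained solution, the child typically needs fewer backpropagation epochs than a network initialized entirely at random, which is one reason the evolutionary search can progress quickly.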


Notes

  1. https://github.com/travisdesell/exact


Acknowledgements

This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Advanced Combustion Systems under Award Number #FE0031547.

Author information

Corresponding author

Correspondence to Travis Desell.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

ElSaid, A., Benson, S., Patwardhan, S., Stadem, D., Desell, T. (2019). Evolving Recurrent Neural Networks for Time Series Data Prediction of Coal Plant Parameters. In: Kaufmann, P., Castillo, P. (eds) Applications of Evolutionary Computation. EvoApplications 2019. Lecture Notes in Computer Science(), vol 11454. Springer, Cham. https://doi.org/10.1007/978-3-030-16692-2_33

  • DOI: https://doi.org/10.1007/978-3-030-16692-2_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-16691-5

  • Online ISBN: 978-3-030-16692-2
