
Energy Demand Forecasting Using Deep Learning

Chapter in Smart Cities Performability, Cognition, & Security

Abstract

Our cities face continuous growth in population and infrastructure and require more energy every day. Energy management is key to the success of the smart city concept, since electricity is an essential resource with no alternative. The basic role of the smart energy concept is to optimize consumption and demand in order to decrease energy costs and increase efficiency. Among its many benefits, smart energy enhances the quality of life of city inhabitants and makes the environment cleaner. One approach to smart energy is to develop prediction models using machine learning (ML) algorithms to forecast energy demand, especially over daily and weekly periods. This chapter describes in detail what lies behind the deep learning concept as a subset of ML and how neural networks can be applied to develop energy prediction models. A specialized version of the recurrent neural network (RNN), the long short-term memory (LSTM) network, is described in detail. In addition, the chapter addresses the question of why LSTM is a state-of-the-art ML algorithm for time series modeling today. To this end, we introduce ANNdotNET, a user-friendly ML framework capable of importing data from the smart grids of a smart city. By design, ANNdotNET is a cloud solution to which other Internet of Things (IoT) devices can connect for collecting and feeding data, providing efficient models to energy managers within a larger smart city cloud solution. As an example, the chapter presents the development of daily and weekly energy demand models for Nicosia, the capital of Northern Cyprus. Currently, energy demand predictions for the city are not as accurate as expected; the results of this chapter can therefore serve as efficient alternatives for IoT-based energy prediction models in any smart city.
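To make the LSTM mechanism the abstract refers to concrete, the following is a minimal sketch of a single LSTM cell's forward pass in NumPy. It is illustrative only: the chapter itself works with the ANNdotNET toolchain, and every dimension, weight initialization, and the toy "normalized daily load" values below are hypothetical, not taken from the chapter's models.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step over stacked gate parameters.

    x       : input vector, shape (n_in,)
    h_prev  : previous hidden state, shape (n_hid,)
    c_prev  : previous cell state, shape (n_hid,)
    W, U, b : parameters for the four gates stacked row-wise
              (input i, forget f, cell candidate g, output o).
    """
    n_hid = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # pre-activations, shape (4*n_hid,)
    i = sigmoid(z[0*n_hid:1*n_hid])     # input gate
    f = sigmoid(z[1*n_hid:2*n_hid])     # forget gate
    g = np.tanh(z[2*n_hid:3*n_hid])     # candidate cell state
    o = sigmoid(z[3*n_hid:4*n_hid])     # output gate
    c = f * c_prev + i * g              # cell state: gated memory update
    h = o * np.tanh(c)                  # hidden state exposed downstream
    return h, c

# Toy demand window: run the cell over a few (made-up) normalized daily loads.
rng = np.random.default_rng(0)
n_in, n_hid = 1, 8
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for load in [0.61, 0.64, 0.70, 0.68]:
    h, c = lstm_step(np.array([load]), h, c, W, U, b)

print(h.shape)  # final hidden state summarizing the input window
```

The forget gate's multiplicative path through the cell state is what lets gradients flow across many time steps, which is why LSTM handles the daily and weekly dependencies in energy demand series better than a plain RNN; a forecasting model would feed the final hidden state into a small output layer to predict the next load value.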



Author information

Correspondence to Bahrudin Hrnjica.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Hrnjica, B., & Mehr, A. D. (2020). Energy demand forecasting using deep learning. In F. Al-Turjman (Ed.), Smart Cities Performability, Cognition, & Security. EAI/Springer Innovations in Communication and Computing. Springer, Cham. https://doi.org/10.1007/978-3-030-14718-1_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-14718-1_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-14717-4

  • Online ISBN: 978-3-030-14718-1

  • eBook Packages: Engineering, Engineering (R0)
