Forecasting Natural Gas Flows in Large Networks

  • Conference paper
  • Machine Learning, Optimization, and Big Data (MOD 2017)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10710)

Abstract

Natural gas is the cleanest fossil fuel, as it emits the least residue when burned. Over the years, natural gas usage has increased significantly, and accurate forecasting is crucial for maintaining gas supplies, transportation, and network stability. This paper presents two methodologies to identify the optimal configuration of parameters of a Neural Network (NN) to forecast the next 24 h of gas flow for each node of a large gas network.

In particular, the first methodology applies a Design Of Experiments (DOE) to obtain a quick initial solution. An orthogonal design, consisting of 18 experiments selected among a total of 4,374 combinations of seven parameters (training algorithm, transfer function, regularization, learning rate, lags, and epochs), is used. The best result is selected as the initial solution of an extended experiment, for which Simulated Annealing is run to find the optimal design among 89,100 possible combinations of parameters.
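The DOE-then-annealing loop described above can be sketched as follows. This is an illustrative sketch, not the authors' code: the parameter names and levels in `SPACE` are assumptions, and `evaluate` is a stand-in for training the NN and returning its forecast error.

```python
import math
import random

# Hypothetical discrete search space; the paper's actual parameter levels
# are not listed here, so these values are illustrative only.
SPACE = {
    "training_algorithm": ["lm", "scg", "bfgs"],
    "transfer_function": ["tansig", "logsig"],
    "regularization": [0.0, 0.1, 0.5],
    "learning_rate": [0.001, 0.01, 0.1],
    "lags": [24, 48, 168],
    "epochs": [100, 500, 1000],
}

def evaluate(config):
    """Stand-in objective: a real run would train the NN with this
    configuration and return its forecast error (e.g. MAPE)."""
    # Toy surrogate so the sketch is runnable end to end.
    return config["learning_rate"] + 1.0 / config["lags"]

def neighbour(config):
    """Move to a neighbouring design: change one parameter to another level."""
    key = random.choice(list(SPACE))
    new = dict(config)
    new[key] = random.choice([v for v in SPACE[key] if v != config[key]])
    return new

def simulated_annealing(initial, iters=200, t0=1.0, cooling=0.97):
    """Start from the DOE-selected design and anneal over the discrete grid."""
    current, best = dict(initial), dict(initial)
    current_err = best_err = evaluate(current)
    t = t0
    for _ in range(iters):
        cand = neighbour(current)
        err = evaluate(cand)
        # Accept improvements always; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if err < current_err or random.random() < math.exp((current_err - err) / t):
            current, current_err = cand, err
            if err < best_err:
                best, best_err = dict(cand), err
        t *= cooling
    return best, best_err
```

In this setup the DOE result plays the role of `initial`, so the annealing run can only match or improve on it.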

The second technique is based on the application of a Genetic Algorithm (GA) for the selection of the optimal parameters of a recurrent neural network for time series forecasting. The GA was applied with a binary representation of potential solutions, where subsets of bits in the bit string represent different values for several parameters of the recurrent neural network.
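A minimal sketch of this binary encoding, assuming a hypothetical search space: each parameter owns a fixed 2-bit slice of the chromosome, so each of its four listed levels is addressable. The parameter names, levels, and the `fitness` surrogate are illustrative, not the paper's actual setup.

```python
import random

# Assumed parameter levels; four per parameter so a 2-bit slice indexes them.
LEVELS = {
    "hidden_units": [5, 10, 20, 40],
    "lags": [24, 48, 96, 168],
    "learning_rate": [0.001, 0.01, 0.05, 0.1],
}
BITS_PER_PARAM = 2
CHROMOSOME_LEN = BITS_PER_PARAM * len(LEVELS)

def decode(bits):
    """Map a bit string to a parameter configuration for the recurrent NN."""
    config = {}
    for i, (name, levels) in enumerate(LEVELS.items()):
        chunk = bits[i * BITS_PER_PARAM:(i + 1) * BITS_PER_PARAM]
        config[name] = levels[int(chunk, 2)]
    return config

def fitness(bits):
    """Stand-in fitness: a real run would train the decoded recurrent NN
    and score it by forecast error (higher fitness = lower error)."""
    cfg = decode(bits)
    return cfg["hidden_units"] / (1.0 + cfg["learning_rate"])

def next_generation(pop, mutation_rate=0.05):
    """One GA generation: tournament selection, one-point crossover, mutation."""
    def pick():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b
    children = []
    while len(children) < len(pop):
        p1, p2 = pick(), pick()
        cut = random.randrange(1, CHROMOSOME_LEN)
        child = p1[:cut] + p2[cut:]
        child = "".join(
            b if random.random() > mutation_rate else str(1 - int(b))
            for b in child
        )
        children.append(child)
    return children
```

For example, `decode("011011")` selects level 1 of `hidden_units`, level 2 of `lags`, and level 3 of `learning_rate`.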

We tested these methods on three municipal nodes, using a year and a half of hourly gas flow to train the network and 60 days for testing. Our results clearly show that the presented methodologies yield promising results in terms of optimal parameter configuration and forecast error.
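The evaluation protocol above can be sketched as a chronological split plus a supervised framing of the day-ahead task. The synthetic series below is a placeholder for a real municipal node's hourly flow, and the lag/horizon choices are illustrative.

```python
import numpy as np

HOURS_PER_DAY = 24

# Roughly a year and a half of hourly observations for training plus a
# 60-day hold-out, mirroring the paper's protocol on synthetic data.
rng = np.random.default_rng(0)
n_train_days, n_test_days = 548, 60
series = rng.normal(size=(n_train_days + n_test_days) * HOURS_PER_DAY)

test_len = n_test_days * HOURS_PER_DAY
train, test = series[:-test_len], series[-test_len:]

def make_supervised(y, lags=48, horizon=24):
    """Build (X, Y) pairs: the previous `lags` hours are the inputs and the
    next `horizon` hours (here 24) are the targets, matching the
    day-ahead forecasting task."""
    rows = range(lags, len(y) - horizon + 1)
    X = np.array([y[t - lags:t] for t in rows])
    Y = np.array([y[t:t + horizon] for t in rows])
    return X, Y

X_train, Y_train = make_supervised(train)
```

Keeping the 60-day test block strictly after the training window avoids leaking future flow values into the fitted network.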



Acknowledgments

The authors would like to acknowledge networking support by the COST Action TD1207.

Author information

Corresponding author: Natalia Selini Hadjidimitriou


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Dell’Amico, M., Hadjidimitriou, N.S., Koch, T., Petkovic, M. (2018). Forecasting Natural Gas Flows in Large Networks. In: Nicosia, G., Pardalos, P., Giuffrida, G., Umeton, R. (eds) Machine Learning, Optimization, and Big Data. MOD 2017. Lecture Notes in Computer Science, vol. 10710. Springer, Cham. https://doi.org/10.1007/978-3-319-72926-8_14

  • DOI: https://doi.org/10.1007/978-3-319-72926-8_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-72925-1

  • Online ISBN: 978-3-319-72926-8
