Natural gas is the cleanest fossil fuel, since its combustion releases the lowest amount of residual pollutants. Over the years, natural gas usage has increased significantly, and accurate forecasting is crucial for maintaining gas supplies, transportation, and network stability. This paper presents two methodologies to identify the optimal configuration of parameters of a Neural Network (NN) to forecast the next 24 h of gas flow for each node of a large gas network.
In particular, the first applies a Design Of Experiments (DOE) to obtain a quick initial solution. An orthogonal design, consisting of 18 experiments selected among a total of 4,374 combinations of seven parameters (including training algorithm, transfer function, regularization, learning rate, lags, and epochs), is used. The best result is selected as the initial solution of an extended experiment, for which Simulated Annealing is run to find the optimal design among 89,100 possible combinations of parameters.
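As a rough illustration of this second stage, the sketch below shows how Simulated Annealing can walk over a discrete hyperparameter space starting from a DOE-selected configuration. The parameter names, levels, cooling schedule, and the `evaluate` function are illustrative assumptions, not the paper's actual settings.

```python
import math
import random

# Illustrative search space; the paper's seven parameters and their levels
# are not fully listed in the abstract, so these names and values are assumptions.
SPACE = {
    "training_algorithm": ["lm", "rprop", "scg"],
    "transfer_function": ["tansig", "logsig", "purelin"],
    "regularization": [0.0, 0.1, 0.3],
    "learning_rate": [0.001, 0.01, 0.1],
    "lags": [24, 48, 168],
    "epochs": [100, 500, 1000],
}

def neighbor(config):
    """Perturb one randomly chosen parameter to another of its levels."""
    new = dict(config)
    key = random.choice(list(SPACE))
    new[key] = random.choice(SPACE[key])
    return new

def simulated_annealing(initial, evaluate, t0=1.0, cooling=0.95, iters=200):
    """Minimize forecast error starting from the DOE-selected configuration.

    `evaluate(config)` is assumed to train the NN with `config` and return a
    validation error (e.g. error on the 24 h-ahead forecast).
    """
    current, best = initial, initial
    current_err = best_err = evaluate(initial)
    t = t0
    for _ in range(iters):
        cand = neighbor(current)
        err = evaluate(cand)
        # Accept better candidates always; worse ones with Boltzmann probability.
        if err < current_err or random.random() < math.exp((current_err - err) / t):
            current, current_err = cand, err
            if err < best_err:
                best, best_err = cand, err
        t *= cooling
    return best, best_err
```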
The second technique is based on the application of a Genetic Algorithm (GA) for the selection of the optimal parameters of a recurrent neural network for time series forecasting. The GA is applied with a binary representation of potential solutions, where subsets of bits in the bit string encode different values for several parameters of the recurrent neural network.
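A minimal sketch of this binary encoding follows: fixed-width bit fields of the chromosome are decoded into parameter levels, and a standard generational GA with one-point crossover and bit-flip mutation searches over them. The parameter names, field widths, and the `fitness` function are assumptions made for the sake of the sketch, not the paper's actual encoding.

```python
import random

# Assumed parameter levels; each parameter occupies a fixed-width bit field.
LEVELS = {
    "hidden_units":  [5, 10, 20, 40],
    "lags":          [24, 48, 96, 168],
    "learning_rate": [0.001, 0.005, 0.01, 0.05],
    "epochs":        [100, 250, 500, 1000],
}
BITS = 2  # bits per field in this toy encoding (4 levels each)

def decode(bitstring):
    """Map consecutive 2-bit fields of the chromosome to parameter values."""
    params, pos = {}, 0
    for name, values in LEVELS.items():
        idx = int(bitstring[pos:pos + BITS], 2)
        params[name] = values[idx]
        pos += BITS
    return params

def evolve(fitness, pop_size=20, generations=30, p_mut=0.05):
    """Minimal generational GA over fixed-length bit strings.

    `fitness(params)` is assumed to train the recurrent network and return a
    score to maximize (e.g. the negative forecast error).
    """
    length = BITS * len(LEVELS)
    pop = ["".join(random.choice("01") for _ in range(length)) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda c: fitness(decode(c)), reverse=True)
        parents = scored[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randint(1, length - 1)                  # one-point crossover
            child = a[:cut] + b[cut:]
            child = "".join(bit if random.random() > p_mut else str(1 - int(bit))
                            for bit in child)                    # bit-flip mutation
            children.append(child)
        pop = children
    best = max(pop, key=lambda c: fitness(decode(c)))
    return decode(best)
```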
We tested these methods on three municipal nodes, using one and a half years of hourly gas flow data to train the network and 60 days for testing. Our results clearly show that the presented methodologies yield promising results in terms of optimal configuration of parameters and forecast error.
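For concreteness, a minimal sketch of the chronological split assumed here: roughly 13,140 hourly observations (one and a half years) for training and the final 1,440 hours (60 days) for testing, with each sample pairing a window of past lags with the next 24 hourly values. The `lags` default and window construction below are illustrative, not the paper's exact preprocessing.

```python
import numpy as np

HOURS_PER_DAY = 24
TRAIN_HOURS = int(1.5 * 365 * HOURS_PER_DAY)   # roughly 13,140 hourly samples
TEST_HOURS  = 60 * HOURS_PER_DAY               # 1,440 hourly samples

def make_windows(series, lags=168, horizon=24):
    """Turn an hourly series into (past-lags input, next-24h target) pairs."""
    X, y = [], []
    for t in range(lags, len(series) - horizon + 1):
        X.append(series[t - lags:t])
        y.append(series[t:t + horizon])
    return np.array(X), np.array(y)

def split_node_series(series, lags=168, horizon=24):
    """Chronological split: first 1.5 years for training, final 60 days for testing."""
    train = series[:TRAIN_HOURS]
    test = series[TRAIN_HOURS - lags:]  # keep the last lags hours so the first test target starts at the boundary
    return make_windows(train, lags, horizon), make_windows(test, lags, horizon)
```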
Keywords: Machine learning · Neural networks · Genetic algorithm · Simulated annealing · Design Of Experiments (DOE) · Time series forecasting
The authors would like to acknowledge networking support by the COST Action TD1207.