Forecasting the daily electricity consumption in the Moscow region using artificial neural networks
In [1, 2] we demonstrated that short-term forecasting of daily passenger traffic volumes in the Moscow metro is possible in principle with the help of artificial neural networks. During training and prediction, a set of factors that affect the daily passenger traffic in the metro is passed to the input of the neural network. One of these factors is the daily power consumption in the Moscow region. Therefore, to predict the passenger traffic volume in the metro, we must first solve the problem of forecasting the daily energy consumption in the Moscow region.
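As a minimal illustration of the approach described above (not the authors' actual model), the sketch below trains a one-hidden-layer perceptron by backpropagation to map a vector of daily factors to a consumption estimate. The input factors (day-of-week, a temperature proxy, the previous day's load) and the synthetic target are hypothetical stand-ins chosen only to make the example self-contained:

```python
# Minimal sketch, NOT the paper's model: a one-hidden-layer perceptron
# trained by backpropagation on synthetic "daily factor" data.
import math
import random

random.seed(0)

def make_net(n_in, n_hidden):
    """Random initial weights: input->hidden rows and hidden->output vector.

    Each row/vector carries one extra weight for a bias input of 1.0.
    """
    w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
          for _ in range(n_hidden)]
    w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]
    return w1, w2

def forward(net, x):
    """Hidden activations (tanh) and a linear output unit."""
    w1, w2 = net
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x + [1.0]))) for row in w1]
    y = sum(w * hi for w, hi in zip(w2, h + [1.0]))
    return h, y

def train_step(net, x, t, lr=0.01):
    """One stochastic backpropagation update for a (factors, target) pair."""
    w1, w2 = net
    h, y = forward(net, x)
    err = y - t
    w2_old = list(w2)                      # use pre-update weights below
    for j, hj in enumerate(h + [1.0]):     # output-layer gradient
        w2[j] -= lr * err * hj
    for i, row in enumerate(w1):           # hidden layer, tanh' = 1 - h^2
        delta = err * w2_old[i] * (1.0 - h[i] ** 2)
        for j, xj in enumerate(x + [1.0]):
            row[j] -= lr * delta * xj

# Synthetic daily samples: [day-of-week/6, normalized temperature, previous load]
data = []
for d in range(200):
    dow = (d % 7) / 6.0
    temp = 0.5 + 0.4 * math.sin(2 * math.pi * d / 365)
    prev = 0.6 + 0.2 * math.sin(2 * math.pi * d / 7)
    target = 0.5 * prev + 0.3 * temp + 0.1 * dow   # toy "consumption"
    data.append(([dow, temp, prev], target))

def mse(net):
    return sum((forward(net, x)[1] - t) ** 2 for x, t in data) / len(data)

net = make_net(3, 5)
before = mse(net)
for _ in range(50):                        # a few epochs of SGD
    for x, t in data:
        train_step(net, x, t)
after = mse(net)
print(f"MSE before: {before:.4f}, after: {after:.4f}")
```

The same structure carries over to the real task: each training sample pairs a day's factor vector with the observed daily consumption, and the trained network is queried on the factors of the day to be forecast.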
- 1. V. V. Ivanov and E. S. Osetrov, “Forecasting of passenger traffic in the Moscow metro applying artificial neural networks,” Vestn. MIFI 5, 65–74 (2016).
- 2. V. V. Ivanov and E. S. Osetrov, “Forecasting of the Moscow metro passenger traffic applying artificial neural networks with preliminary filtering of the analyzed data,” Vestn. MIFI 5, 162–169 (2016).
- 4. B. Denby, “Tutorial on neural network applications in high energy physics: a 1992 perspective,” in New Computing Techniques in Physics Research II, Proceedings of the 2nd International Workshop on Software Engineering, Artificial Intelligence and Expert Systems in High Energy Physics, La Londe-les-Maures, France, Jan. 13–18, 1992, Ed. by D. Perret-Gallix (World Scientific, 1992), p. 287.
- 5. S. F. Fogelman, “Neural networks for pattern recognition: introduction and comparison to other techniques,” in New Computing Techniques in Physics Research II, Proceedings of the 2nd International Workshop on Software Engineering, Artificial Intelligence and Expert Systems in High Energy Physics, La Londe-les-Maures, France, Jan. 13–18, 1992, Ed. by D. Perret-Gallix (World Scientific, 1992), p. 277.
- 6. D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning internal representations by error propagation,” in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations, Ed. by D. E. Rumelhart and J. L. McClelland (MIT Press, Cambridge, MA, 1986).
- 7. N. E. Golyandina, V. V. Nekrutkin, and K. A. Braulov, Caterpillar-SSA Method: Time Series Analysis, Gistat Group. http://www.gistatgroup.com/gus/.
- 10. A. Hoecker, P. Speckmayer, J. Stelzer, J. Therhaag, E. von Toerne, and H. Voss, “TMVA 4.2.0: toolkit for multivariate data analysis with ROOT,” arXiv:physics/0703039; TMVA version 4.2.0, CERN-OPEN-2007-007 (2013). http://tmva.sourceforge.net.
- 12. S. Lahmiri, “A comparative study of backpropagation algorithms in financial prediction,” Int. J. Comput. Sci. Eng. Appl. 1 (4), 15–21 (2011).
- 16. I. Daubechies, Wavelets (SIAM, Philadelphia, 1992).
- 22. F. James and M. Roos, “MINUIT: function minimization and error analysis,” CERN Program Library D506 (CERN, 1988).
- 23. R. Brun, O. Couet, C. Vandoni, and P. Zanarini, “PAW: physics analysis workstation,” CERN Program Library Q121 (CERN, 1989).
- 24. D. L. Danilov and A. A. Zhiglyavskii, Principal Components of Time Series: The Caterpillar Method (SPb. Gos. Univ., St. Petersburg, 1997) [in Russian].
- 26. www.gistatgroup.com/cat/.