Time Series Decomposition for Improving the Forecasting Performance of Convolutional Neural Networks

  • Conference paper
  • Advances in Artificial Intelligence (CAEPIA 2018)

Abstract

Time series forecasting is of high interest in the Big Data ecosystem. The larger data volumes accessible in industry and science, together with the higher profit obtainable from more accurate predictions, have driven a growing application of Deep Learning techniques to time series forecasting. In this work, the improvement in the forecasting capacity of Convolutional Neural Networks and Recurrent Neural Networks is evaluated when the trend, seasonal and remainder series generated by the Seasonal and Trend decomposition using Loess (STL) are used as input instead of the original time series observations. The benchmark used in this work is composed of eight seasonal time series of different lengths and origins. Besides Convolutional Neural Networks and Recurrent Neural Networks, comparisons with Multilayer Perceptrons are also undertaken. The results show an improvement in forecasting capacity when the original observations are replaced by their decomposition in Convolutional Neural Network-based forecasting.
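
To make the approach described in the abstract concrete, the following is a minimal sketch, assuming Python with statsmodels for the STL decomposition [7] and Keras [6] for the network: a synthetic monthly series is decomposed into trend, seasonal and remainder components, which are then fed as parallel input channels to a small one-dimensional CNN. The synthetic series, window length and network architecture are illustrative assumptions, not the configuration used by the authors.

```python
import numpy as np
from statsmodels.tsa.seasonal import STL
from tensorflow.keras import layers, models

# Synthetic monthly series with trend and yearly seasonality
# (a stand-in for the seasonal benchmark series used in the paper).
rng = np.random.default_rng(0)
t = np.arange(240)
y = 0.5 * t + 10.0 * np.sin(2.0 * np.pi * t / 12.0) + rng.normal(0.0, 1.0, t.size)

# STL decomposition (Cleveland et al. [7]); period=12 for monthly data.
res = STL(y, period=12).fit()
components = np.stack([res.trend, res.seasonal, res.resid], axis=-1)  # shape (n, 3)

# Sliding windows: the last `window` months of the three components
# are used to predict the next observation of the original series.
window = 24
X = np.stack([components[i:i + window] for i in range(len(y) - window)])
Y = y[window:]

# Small 1D CNN over the time axis with three input channels
# (trend, seasonal, remainder) instead of the raw observations.
model = models.Sequential([
    layers.Conv1D(16, kernel_size=3, activation="relu", input_shape=(window, 3)),
    layers.Conv1D(16, kernel_size=3, activation="relu"),
    layers.Flatten(),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, Y, epochs=10, batch_size=16, verbose=0)
```

Feeding the three components as channels is one natural way to present the decomposition to a CNN; using the original observations instead would correspond to a single-channel input of the same window length.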


Notes

  1.

    Regarding previous efforts in the analysis of the \(^{222}Rn\) level at the Canfranc Underground Laboratory (LSC), the initial work on modelling this time series with classical and deep-learning-based approaches is presented in [20]. In this paper, the time series is modelled using CNNs, with the main focus on the monthly forecasting capacity for scheduling maintenance operations of the experiment hosted at the LSC. The LSC comprises several halls for hosting scientific experiments with very low-background requirements. The two main halls, Hall A and Hall B, which are contiguous, have instruments for measuring the level of \(^{222}Rn\).

  2.

    In [1], an assessment of the relationship between \(SO_2\) and total suspended particulate levels and mortality in Madrid during the period 1986–1992 is presented. In that study, the time series analysis is based on multivariate autoregressive integrated moving-average (ARIMA) models. Other publications relating acoustic or air pollution in Madrid to population health can be found in [8, 18].

References

  1. Alberdi Odriozola, J.C., Díaz Jiménez, J., Montero Rubio, J.C., Mirón Pérez, I.J., Pajares Ortíz, M.S., Ribera Rodrigues, P.: Air pollution and mortality in Madrid, Spain: a time-series analysis. Int. Arch. Occup. Environ. Health 71(8), 543–549 (1998). https://doi.org/10.1007/s004200050321

  2. BeerAU Time Series. https://datamarket.com/data/set/22xr/monthly-beer-production-in-australia-megalitres-includes-ale-and-stout-does-not-include-beverages-with-alcohol-percentage-less-than-115-jan-1956-aug-1995

  3. Births Time Series. https://datamarket.com/data/set/22nv/monthly-new-york-city-births-unknown-scale-jan-1946-dec-1959

  4. Bishop, C.M.: Neural Networks for Pattern Recognition. Oxford University Press Inc., New York (1995)

  5. Chniti, G., Bakir, H., Zaher, H.: E-commerce time series forecasting using LSTM neural network and support vector regression. In: Proceedings of the International Conference on Big Data and Internet of Thing, BDIOT 2017, pp. 80–84. ACM, New York (2017). https://doi.org/10.1145/3175684.3175695

  6. Chollet, F., et al.: Keras (2015). https://github.com/fchollet/keras

  7. Cleveland, R.B., Cleveland, W.S., McRae, J., Terpenning, I.: STL: a seasonal-trend decomposition procedure based on loess. J. Off. Stat. 6(1), 3–33 (1990)

  8. Díaz, J., García, R., Ribera, P., Alberdi, J.C., Hernández, E., Pajares, M.S., Otero, A.: Modeling of air pollution and its relationship with mortality and morbidity in Madrid, Spain. Int. Arch. Occup. Environ. Health 72(6), 366–376 (1999). https://doi.org/10.1007/s004200050388

  9. Gamboa, J.C.B.: Deep learning for time-series analysis. CoRR abs/1701.01887 (2017). http://arxiv.org/abs/1701.01887

  10. García, S., Fernández, A., Luengo, J., Herrera, F.: A study of statistical techniques and performance measures for genetics-based machine learning: accuracy and interpretability. Soft Comput. 13(10), 959–977 (2009)

  11. García, S., Molina, D., Lozano, M., Herrera, F.: A study on the use of non-parametric tests for analyzing the evolutionary algorithms’ behaviour: a case study on the CEC’2005 special session on real parameter optimization. J. Heuristics 15(6), 617–644 (2009)

  12. Garcia-Pedrero, A., Gomez-Gil, P.: Time series forecasting using recurrent neural networks and wavelet reconstructed signals. In: 2010 20th International Conference on Electronics Communications and Computers (CONIELECOMP), pp. 169–173, February 2010. https://doi.org/10.1109/CONIELECOMP.2010.5440775

  13. Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning. MIT Press (2016). http://www.deeplearningbook.org

  14. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997). https://doi.org/10.1162/neco.1997.9.8.1735

  15. Lago, J., Ridder, F.D., Schutter, B.D.: Forecasting spot electricity prices: deep learning approaches and empirical comparison of traditional algorithms. Appl. Energy 221, 386–405 (2018). https://doi.org/10.1016/j.apenergy.2018.02.069. http://www.sciencedirect.com/science/article/pii/S030626191830196X

  16. LeCun, Y.: Generalization and network design strategies. University of Toronto, Technical report (1989)

  17. Lecun, Y., Bottou, L., Bengio, Y., Haffner, P.: Gradient-based learning applied to document recognition. Proc. IEEE 86(11), 2278–2324 (1998). https://doi.org/10.1109/5.726791

  18. Linares, C., Díaz, J., Tobías, A., Miguel, J.M.D., Otero, A.: Impact of urban air pollutants and noise levels over daily hospital admissions in children in Madrid: a time series analysis. Int. Arch. Occup. Environ. Health 79(2), 143–152 (2006). https://doi.org/10.1007/s00420-005-0032-0

  19. Lipton, Z.C.: A critical review of recurrent neural networks for sequence learning. CoRR abs/1506.00019 (2015). http://arxiv.org/abs/1506.00019

  20. Méndez-Jiménez, I., Cárdenas-Montes, M.: Modelling and forecasting of the \(^{222}Rn\) radiation level time series at the Canfranc Underground Laboratory. In: de Cos Juez, F. (ed.) HAIS 2018. LNCS, vol. 10870, pp. 158–170. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-92639-1_14

  21. Passengers Time Series. https://datamarket.com/data/set/22u3/international-airline-passengers-monthly-totals-in-thousands-jan-49-dec-60

  22. Qiu, X., Zhang, L., Ren, Y., Suganthan, P.N., Amaratunga, G.A.J.: Ensemble deep learning for regression and time series forecasting. In: 2014 IEEE Symposium on Computational Intelligence in Ensemble Learning, CIEL 2014, Orlando, FL, USA, 9–12 December 2014, pp. 21–26 (2014). https://doi.org/10.1109/CIEL.2014.7015739

  23. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323(6088), 533–536 (1986). https://doi.org/10.1038/323533a0

  24. Walid, Alamsyah: Recurrent neural network for forecasting time series with long memory pattern. J. Phys.: Conf. Ser. 824(1), 012038 (2017). http://stacks.iop.org/1742-6596/824/i=1/a=012038

  25. Wang, H.Z., Li, G.Q., Wang, G.B., Peng, J.C., Jiang, H., Liu, Y.T.: Deep learning based ensemble approach for probabilistic wind power forecasting. Appl. Energy 188, 56–70 (2017). https://doi.org/10.1016/j.apenergy.2016.11.111. http://www.sciencedirect.com/science/article/pii/S0306261916317421

  26. Wang, Z., Yan, W., Oates, T.: Time series classification from scratch with deep neural networks: a strong baseline. CoRR abs/1611.06455 (2016). http://arxiv.org/abs/1611.06455

  27. Zheng, Y., Liu, Q., Chen, E., Ge, Y., Zhao, J.L.: Time series classification using multi-channels deep convolutional neural networks. In: Li, F., Li, G., Hwang, S., Yao, B., Zhang, Z. (eds.) WAIM 2014. LNCS, vol. 8485, pp. 298–310. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-08010-9_33

  28. Zheng, Y., Liu, Q., Chen, E., Ge, Y., Zhao, J.L.: Exploiting multi-channels deep convolutional neural networks for multivariate time series classification. Front. Comput. Sci. 10(1), 96–112 (2016). https://doi.org/10.1007/s11704-015-4478-2


Acknowledgment

The research leading to these results has received funding from the Spanish Ministry of Economy and Competitiveness (MINECO) through the grant FPA2016-80994-C2-1-R, and through the “Unidad de Excelencia María de Maeztu”: CIEMAT - FÍSICA DE PARTÍCULAS grant MDM-2015-0509.

IMJ is co-funded at 91.89% by the European Social Fund within the Youth Employment Operating Programme for the 2014–2020 programming period, as well as by the Youth Employment Initiative (IEJ). IMJ is also co-funded through the Grants for the Promotion of Youth Employment and Implementation of the Youth Guarantee in Research, Development and Innovation (I+D+i) from MINECO.

Author information

Corresponding author

Correspondence to Miguel Cárdenas-Montes.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Méndez-Jiménez, I., Cárdenas-Montes, M. (2018). Time Series Decomposition for Improving the Forecasting Performance of Convolutional Neural Networks. In: Herrera, F., et al. (eds.) Advances in Artificial Intelligence. CAEPIA 2018. Lecture Notes in Computer Science, vol. 11160. Springer, Cham. https://doi.org/10.1007/978-3-030-00374-6_9

  • DOI: https://doi.org/10.1007/978-3-030-00374-6_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-00373-9

  • Online ISBN: 978-3-030-00374-6

  • eBook Packages: Computer Science (R0)
