
Ensemble Deep Learning for Forecasting \(^{222}Rn\) Radiation Level at Canfranc Underground Laboratory

  • Conference paper

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 950))

Abstract

Ensemble deep learning architectures have been shown to improve performance in comparison with the individual architectures composing the ensemble. In the current work, an ensemble of variants of Convolutional and Recurrent Neural Network architectures is applied to the prediction of the \(^{222}Rn\) level at the Canfranc Underground Laboratory (Spain). Predicting the low-level periods allows appropriate scheduling of the maintenance operations of the experiments hosted in the laboratory. As a consequence of applying ensemble deep learning, an improvement in the forecasting capacity is achieved. Furthermore, the lessons learned from this work can be extrapolated to other underground laboratories around the world.
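
As a rough illustration of the approach described in the abstract, the following minimal sketch (in Python with Keras) builds two ensemble members, a one-dimensional convolutional network and an LSTM recurrent network, trains them on sliding windows of a univariate series, and averages their forecasts. The window length, layer sizes, synthetic stand-in data and simple averaging rule are illustrative assumptions, not the exact configuration used in the paper.

    # Hypothetical sketch: ensemble of a 1D CNN and an LSTM for univariate
    # time-series forecasting. All sizes and the averaging rule are assumptions.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    WINDOW = 24  # number of past readings used to predict the next value

    def make_windows(series, window=WINDOW):
        """Slice a 1D series into (samples, window, 1) inputs and next-step targets."""
        X = np.stack([series[i:i + window] for i in range(len(series) - window)])
        y = series[window:]
        return X[..., np.newaxis], y

    def build_cnn():
        return keras.Sequential([
            layers.Input(shape=(WINDOW, 1)),
            layers.Conv1D(32, kernel_size=3, activation="relu"),
            layers.GlobalAveragePooling1D(),
            layers.Dense(1),
        ])

    def build_lstm():
        return keras.Sequential([
            layers.Input(shape=(WINDOW, 1)),
            layers.LSTM(32),
            layers.Dense(1),
        ])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic stand-in for the radon series: seasonal signal plus noise.
        series = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
        X, y = make_windows(series)
        split = int(0.8 * len(X))
        X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

        members = [build_cnn(), build_lstm()]
        preds = []
        for model in members:
            model.compile(optimizer="adam", loss="mse")
            model.fit(X_tr, y_tr, epochs=5, batch_size=64, verbose=0)
            preds.append(model.predict(X_te, verbose=0).ravel())

        ensemble_pred = np.mean(preds, axis=0)  # simple averaging of member forecasts
        print("ensemble test MSE:", float(np.mean((ensemble_pred - y_te) ** 2)))

In the setting of the paper, the synthetic series would be replaced by the measured \(^{222}Rn\) time series, and the plain average could be replaced by any other combination scheme for the member forecasts.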

Acknowledgment

The research leading to these results has received funding from the Spanish Ministry of Economy and Competitiveness (MINECO) through the grant FPA2016-80994-C2-1-R, and from the “Unidad de Excelencia María de Maeztu”: CIEMAT - FÍSICA DE PARTÍCULAS, through the grant MDM-2015-0509.

IMJ is co-funded at 91.89% by the European Social Fund within the Youth Employment Operating Program for the programming period 2014–2020, as well as by the Youth Employment Initiative (IEJ). IMJ is also co-funded through the Grants for the Promotion of Youth Employment and Implantation of Youth Guarantee in Research and Development and Innovation (I+D+i) from MINECO.

The authors would like to thank Roberto Santorelli, Pablo García Abia and Vicente Pesudo for useful comments regarding the physics-related aspects of this work, and the Canfranc Underground Laboratory for providing valuable feedback.

Author information

Corresponding author

Correspondence to Miguel Cárdenas-Montes.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Cárdenas-Montes, M., Méndez-Jiménez, I. (2020). Ensemble Deep Learning for Forecasting \(^{222}Rn\) Radiation Level at Canfranc Underground Laboratory. In: Martínez Álvarez, F., Troncoso Lora, A., Sáez Muñoz, J., Quintián, H., Corchado, E. (eds) 14th International Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO 2019). SOCO 2019. Advances in Intelligent Systems and Computing, vol 950. Springer, Cham. https://doi.org/10.1007/978-3-030-20055-8_15
