Detecting Methane Outbreaks from Time Series Data with Deep Neural Networks

  • Krzysztof Pawłowski
  • Karol Kurach
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9437)

Abstract

Hazard monitoring systems play a key role in ensuring people’s safety. The problem of detecting dangerous levels of methane concentration in a coal mine was the subject of the IJCRS’15 Data Challenge competition. The task was to predict, from multivariate time series data collected by sensors, whether the methane concentration will reach a dangerous level in the near future. In this paper we present our solution to this problem, based on an ensemble of Deep Neural Networks. In particular, we focus on Recurrent Neural Networks with Long Short-Term Memory (LSTM) cells.
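To make the setup described in the abstract concrete, the sketch below shows one way such a system could look: an LSTM network reads a multivariate sensor sequence and outputs a warning probability, and the probabilities of several independently trained networks are averaged. It is written in PyTorch, and everything in it (class names, sensor count, sequence length, hidden size, the averaging scheme) is an illustrative assumption, not the authors' implementation.

```python
# Illustrative sketch only; hypothetical, not the code from the paper.
import torch
import torch.nn as nn

class MethaneLSTM(nn.Module):
    """LSTM classifier: multivariate sensor series -> warning probability."""
    def __init__(self, n_sensors: int, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_sensors, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, time_steps, n_sensors)
        _, (h_n, _) = self.lstm(x)          # final hidden state summarizes the series
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)

def ensemble_predict(models, x):
    """Average the warning probabilities of independently trained models."""
    with torch.no_grad():
        return torch.stack([m(x) for m in models]).mean(dim=0)

# Example: 5 models, 4 series of 600 steps from 28 sensors (made-up sizes).
models = [MethaneLSTM(n_sensors=28) for _ in range(5)]
batch = torch.randn(4, 600, 28)
print(ensemble_predict(models, batch))      # one warning probability per series
```

Averaging predicted probabilities is only the simplest ensembling scheme; the ensemble actually used is described in the full text of the paper.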

Keywords

Machine learning · Recurrent neural networks · Ensemble methods · Time series forecasting · Hazard monitoring systems


Copyright information

© Springer International Publishing Switzerland 2015

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 Generic License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Faculty of Mathematics, Informatics and Mechanics, University of Warsaw, Warsaw, Poland