
VAR-GRU: A Hybrid Model for Multivariate Financial Time Series Prediction

  • Lkhagvadorj Munkhdalai
  • Meijing Li
  • Nipon Theera-Umpon
  • Sansanee Auephanwiriyakul
  • Keun Ho Ryu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12034)

Abstract

Determining the most relevant variables and a proper lag length are the most challenging steps in multivariate time series analysis. In this paper, we propose a hybrid Vector Autoregressive and Gated Recurrent Unit (VAR-GRU) model that finds the contextual variables and a suitable lag length in order to improve predictive performance for financial multivariate time series. The VAR-GRU approach consists of two layers: in the first layer, a VAR model selects the variables and the lag length; in the second layer, a GRU-based multivariate prediction model is trained. In the VAR layer, the Akaike Information Criterion (AIC) is used to select the VAR order, i.e., the optimal lag length. Then, the Granger causality test with the optimal lag length is used to identify the causal variables that are passed to the second-layer GRU model. The experimental results demonstrate the ability of the proposed hybrid model to improve prediction performance over all baseline predictors in terms of three evaluation metrics. The model is validated on a real-world financial multivariate time series dataset.
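To make the two-layer pipeline concrete, the following is a minimal Python sketch, assuming statsmodels for the VAR/Granger layer and Keras for the GRU layer. It is an illustration only, not the authors' implementation: the function names, the 0.05 significance level, and the GRU hyperparameters are assumptions.

# Illustrative sketch (not the authors' code): assumes statsmodels for the
# VAR/Granger layer and Keras for the GRU layer; names and thresholds are assumptions.
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.stattools import grangercausalitytests
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import GRU, Dense

def var_layer(df: pd.DataFrame, target: str, max_lag: int = 10, alpha: float = 0.05):
    """Layer 1: choose the AIC-optimal lag, then keep variables that Granger-cause the target."""
    lag = VAR(df).select_order(maxlags=max_lag).aic       # VAR order selected by AIC
    lag = max(1, lag)                                     # Granger test needs at least one lag
    causal = []
    for col in df.columns.drop(target):
        res = grangercausalitytests(df[[target, col]], maxlag=lag, verbose=False)
        if res[lag][0]["ssr_ftest"][1] < alpha:           # F-test p-value at the chosen lag
            causal.append(col)
    return lag, causal

def gru_layer(n_lags: int, n_features: int, units: int = 32):
    """Layer 2: a single GRU followed by a linear output for one-step-ahead prediction."""
    model = Sequential([GRU(units, input_shape=(n_lags, n_features)), Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    return model

In this sketch the selected lag length serves both as the Granger-test order and as the GRU input window, mirroring the two-layer structure described in the abstract.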

Keywords

Multivariate financial time series · Vector Autoregressive · Granger causality · Gated Recurrent Unit

Acknowledgements

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (Nos. 2017R1A2B4010826 and 2019K2A9A2A06020672) in the Republic of Korea, and by the National Natural Science Foundation of China (Grant Nos. 61702324 and 61911540482) in the People’s Republic of China.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Database/Bioinformatics Laboratory, School of Electrical and Computer Engineering, Chungbuk National University, Cheongju, Republic of Korea
  2. College of Information Engineering, Shanghai Maritime University, Shanghai, China
  3. Department of Electrical Engineering, Faculty of Engineering, Chiang Mai University, Chiang Mai, Thailand
  4. Department of Computer Engineering, Faculty of Engineering, Chiang Mai University, Chiang Mai, Thailand
  5. Faculty of Information Technology, Ton Duc Thang University, Ho Chi Minh City, Vietnam
  6. Department of Computer Science, College of Electrical and Computer Engineering, Chungbuk National University, Cheongju, Republic of Korea
