
Prediction of High-Dimensional Time Series with Exogenous Variables Using Generalized Koopman Operator Framework in Reproducing Kernel Hilbert Space

  • Jia-Chen Hua (corresponding author)
  • Farzad Noorian
  • Philip H. W. Leong
  • Gemunu Gunaratne
  • Jorge Gonçalves
Conference paper
Part of the Contributions to Statistics book series (CONTRIB.STAT.)

Abstract

We propose a novel methodology for predicting high-dimensional time series with exogenous variables using the Koopman operator framework, under the assumption that the time series are generated by an underlying unknown dynamical system whose inputs are the exogenous variables. To this end, we first generalize the definition of the original Koopman operator to allow for inputs to the underlying dynamical system. We then obtain a formulation of the generalized Koopman operator in a reproducing kernel Hilbert space (RKHS) and a new derivation of its numerical approximation methods, namely Extended Dynamic Mode Decomposition (EDMD) and its kernel-based version. We also obtain a statistical interpretation of kernel-based EDMD, originally developed for the deterministic Koopman operator, by exploiting the connection between RKHS and Gaussian process regression, and relate it to the stochastic Koopman and Perron–Frobenius operators. In applications, the prediction performance of this methodology is promising when forecasting real-world high-dimensional time series with exogenous variables, including financial market data. We believe this methodology will be of interest to scientists and engineers working in quantitative finance, econometrics, systems biology, neuroscience, meteorology, oceanography, system identification and control, data mining, machine learning, computational intelligence, and many other fields involving high-dimensional time series and spatiotemporal data.
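
The full derivation appears in the body of the paper; the following is only a rough illustration of the idea. One common way to let the Koopman operator act on a system with input, adopted here purely as an assumption since the abstract does not state the paper's precise generalization, is (K_u g)(x) = g(F(x, u)), i.e. the operator composes observables g with the input-driven flow map F. The minimal Python sketch below uses the Gaussian-process/kernel-ridge view mentioned above to build a one-step predictor on input-augmented snapshots z_k = (x_k, u_k); the function names rbf_kernel, fit_predictor, and predict_next, the RBF kernel choice, and the regularization constant are hypothetical and do not reproduce the authors' kernel-based EDMD implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-gamma * sq_dists)

def fit_predictor(X, U, Y, gamma=1.0, reg=1e-6):
    """Fit a one-step predictor on input-augmented snapshots z_k = [x_k, u_k].

    X: (m, n) current states, U: (m, p) exogenous inputs, Y: (m, n) next states.
    This is the kernel-ridge / GP posterior-mean estimate, used here only as a
    stand-in for the paper's kernel-based EDMD approximation.
    """
    Z = np.hstack([X, U])                                # augmented snapshots
    G = rbf_kernel(Z, Z, gamma)                          # Gram matrix in the RKHS
    W = np.linalg.solve(G + reg * np.eye(len(Z)), Y)     # regularized "weights"
    return Z, W

def predict_next(Z_train, W, x, u, gamma=1.0):
    """Predict the next state from the current state x and exogenous input u."""
    z = np.hstack([x, u])[None, :]
    return (rbf_kernel(z, Z_train, gamma) @ W)[0]

# Toy usage on a hypothetical linear system x_{k+1} = 0.9 x_k + 0.1 u_k
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
U = rng.normal(size=(200, 1))
Y = 0.9 * X + 0.1 * U
Z, W = fit_predictor(X, U, Y, gamma=0.5)
x_next = predict_next(Z, W, X[0], U[0], gamma=0.5)
```

In this view, the GP/kernel-ridge posterior mean plays the role of the finite-sample approximation of the (generalized) Koopman operator acting on the state observables; multi-step forecasts would be produced by feeding each prediction back in together with the next exogenous input.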

Keywords

High-dimensional time series · Spatiotemporal dynamics · Complex system · Koopman operator · Perron–Frobenius operator · Dynamical system · Reproducing kernel Hilbert space · Gaussian processes · Machine learning · Data mining · Econophysics · Financial markets modeling · Energy forecasting · Collective behavior

Notes

Acknowledgements

The corresponding author would like to thank Dr. Alexandre Mauroy for insightful discussions on generalizing the Koopman operator to systems with input.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Luxembourg Centre for Systems Biomedicine, University of Luxembourg, Belvaux, Luxembourg
  2. School of Electrical and Information Engineering, University of Sydney, Sydney, Australia
  3. Department of Physics, University of Houston, Houston, USA
