Regression Techniques Used in Hydrometeorology

  • Wei Gong
Reference work entry

Abstract

Regression methods play an important role in ensemble forecasting. The atmosphere-land-ocean system is complex and dynamic, which makes it difficult to predict the state of hydrometeorological variables deterministically; stochastic approaches are therefore useful for hydrometeorological forecasting. Because forecast uncertainty is inevitable, regression approaches are essential for extracting useful information from raw observations and from dynamical-model forecasts while providing an appropriate estimate of the confidence level of the forecasts. Regression methods are typically used in two ways in ensemble forecasting. The first is as a statistical forecasting model that captures the relationships between predictors and historical observations. The second is as a post-processor for the forecasts of dynamical models, correcting their various biases and improving their reliability and skill scores. If statistical relationships between the dynamical forecasts and the observations exist, systematic bias and ensemble-distribution errors can be corrected and the associated uncertainty reduced. The two ways of applying regression share a common statistical foundation. This chapter gives a brief introduction to common linear and nonlinear regression approaches that have been used, or are potentially applicable, in ensemble forecasting.
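As a concrete illustration of the post-processing use described above, the sketch below fits an ordinary least-squares correction between dynamical forecasts and observations and attaches a Gaussian predictive interval to a new forecast. It is a minimal, hypothetical example written in Python with NumPy; the synthetic data, variable names, and 90% interval are assumptions for illustration, not taken from the chapter.

```python
import numpy as np

# Minimal sketch of regression-based post-processing (MOS-style correction).
# Synthetic "observations" and biased "dynamical forecasts" stand in for a
# real training archive; all names and numbers here are illustrative.
rng = np.random.default_rng(42)

n_train = 200
obs = 10.0 + 5.0 * rng.standard_normal(n_train)               # observed variable (e.g., temperature)
fcst = 1.5 + 0.8 * obs + 2.0 * rng.standard_normal(n_train)   # raw forecast with bias and noise

# Fit obs = a + b * fcst by ordinary least squares.
X = np.column_stack([np.ones_like(fcst), fcst])
coef, _, _, _ = np.linalg.lstsq(X, obs, rcond=None)
a, b = coef

# The residual standard deviation gives a crude confidence level for the
# corrected forecast, under a Gaussian predictive assumption.
resid = obs - (a + b * fcst)
sigma = resid.std(ddof=2)

# Post-process a new raw forecast and attach an approximate 90% interval.
new_fcst = 12.0
corrected = a + b * new_fcst
lower, upper = corrected - 1.645 * sigma, corrected + 1.645 * sigma
print(f"corrected = {corrected:.2f}, 90% interval = [{lower:.2f}, {upper:.2f}]")
```

The same regression machinery applies to the first use as well: replacing the raw dynamical forecast with any set of observed predictors turns the sketch into a purely statistical forecasting model.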

Keywords

Regression · ANOVA · Ridge regression · Quantile regression · Logistic regression · Poisson regression · Gaussian process regression · Kriging · Regression tree · Multivariate adaptive regression splines · Support vector machine · Artificial neural network

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. State Key Laboratory of Earth Surface Processes and Resource Ecology, Faculty of Geographical Science, Beijing Normal University, Beijing, China
  2. Institute of Land Surface System and Sustainable Development, Faculty of Geographical Science, Beijing Normal University, Beijing, China

Section editors and affiliations

  • James Brown (1)
  • Wei Gong (2)
  1. Hydrologic Solutions Limited, Southampton, UK
  2. College of Global Change and Earth System Science, Beijing Normal University, Beijing, China