Nonparametric Regression Models for Data Streams Based on the Generalized Regression Neural Networks

  • Leszek Rutkowski
  • Maciej Jaworski
  • Piotr Duda
Chapter
Part of the Studies in Big Data book series (SBD, volume 56)

Abstract

The literature on supervised learning algorithms for data stream mining is dominated by pattern classification methods; only a few address non-stationary regression. Most of these rely on Gaussian or Markov models, extend the Support Vector Machine or the Extreme Learning Machine to regression problems, or adapt regression trees or polynomial regression to a non-stationary environment. We briefly describe these approaches.
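As background for the method named in the chapter title: Specht's generalized regression neural network computes a Nadaraya–Watson kernel regression estimate, i.e. a kernel-weighted average of the training targets. A minimal sketch with a Gaussian kernel (function and variable names are illustrative, not from the chapter):

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.5):
    """GRNN prediction: Nadaraya-Watson kernel regression
    with a Gaussian kernel of bandwidth `sigma`."""
    # Squared Euclidean distances between query points and training patterns
    d2 = np.sum((x_query[:, None, :] - x_train[None, :, :]) ** 2, axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))       # kernel weights, shape (n_query, n_train)
    return w @ y_train / np.sum(w, axis=1)     # kernel-weighted average of targets

# Noisy samples of y = sin(x) on [0, 2*pi]
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(200)

x_new = np.array([[np.pi / 2]])
print(grnn_predict(x, y, x_new))   # close to sin(pi/2) = 1
```

Unlike the parametric models mentioned above, this estimator is memory-based: every training pattern contributes a kernel weight, which is what the recursive, time-varying variants developed in the chapter modify to track concept drift.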


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Leszek Rutkowski
    • 1
    • 2
  • Maciej Jaworski
    • 1
  • Piotr Duda
    • 1
  1. Institute of Computational Intelligence, Czestochowa University of Technology, Częstochowa, Poland
  2. Information Technology Institute, University of Social Sciences, Lodz, Poland