
General Non-parametric Learning Procedure for Tracking Concept Drift

Stream Data Mining: Algorithms and Their Probabilistic Properties

Part of the book series: Studies in Big Data ((SBD,volume 56))

Abstract

The problem of learning in non-stationary situations has rarely been a subject of study, even in the parametric case. Historically, the first papers on learning in non-stationary environments were published occasionally in the 1960s and 1970s. The natural tool for solving problems of this type seemed to be the dynamic stochastic approximation technique [1, 2], an extension of the Robbins-Monro procedure [3] to the non-stationary case. The traditional stochastic approximation procedure was also used, with good effect, for tracking the root of a changing regression function [4, 5].
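To make the idea concrete, a Robbins-Monro scheme tracking a slowly drifting root can be sketched as follows. This is a minimal illustration, not the chapter's procedure: the linearly drifting root, the gain sequence n^{-0.6} (decaying more slowly than 1/n so the estimate retains tracking ability), and the noise level are all assumptions chosen for the example.

```python
import random

def noisy_observation(x, theta, sigma=0.5):
    # Noisy measurement of the regression function M_t(x) = x - theta_t,
    # whose root theta_t moves over time.
    return (x - theta) + random.gauss(0.0, sigma)

def robbins_monro_tracking(steps=5000, drift=0.001, seed=0):
    random.seed(seed)
    theta = 0.0   # true (moving) root of the regression function
    x = 5.0       # current estimate of the root
    for n in range(1, steps + 1):
        theta += drift              # the root drifts linearly in time
        a_n = 1.0 / n ** 0.6        # slowly decaying gains keep the tracking ability
        y = noisy_observation(x, theta)
        x -= a_n * y                # Robbins-Monro correction step
    return x, theta
```

With a classical gain sequence a_n = 1/n the estimate would eventually freeze and lose the drifting root, which is precisely why the dynamic variants cited above modify the gains or add a trend-correction term.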


References

  1. Dupač, V.: A dynamic stochastic approximation method. Ann. Math. Stat. 36, 1695–1702 (1965)

  2. Dupač, V.: Stochastic approximations in the presence of trend. Neural Netw. 5, 283–288 (1966)

  3. Robbins, H., Monro, S.: A stochastic approximation method. Ann. Math. Stat. 22(1) (1951)

  4. Watanabe, M.: On Robbins-Monro stochastic approximation method with time varying observations. Bull. Math. Statist. 16, 73–91 (1974)

  5. Young, T., Westerberg, R.: Stochastic approximation with a non-stationary regression function. IEEE Trans. Inform. Theory 18, 518–519 (1972)

  6. Fu, K.: Sequential Methods in Pattern Recognition and Machine Learning. Academic, New York (1968)

  7. Tzypkin, J.: Learning algorithms of pattern recognition in non-stationary condition. In: Watanabe, S. (ed.) Frontiers of Pattern Recognition, pp. 527–542. Academic Press, New York (1972)

  8. Nishida, K., Yamauchi, K.: Learning, detecting, understanding, and predicting concept changes. In: International Joint Conference on Neural Networks, IJCNN 2009, pp. 2280–2287. IEEE (2009)

  9. Minku, L.L., Yao, X.: DDD: a new ensemble approach for dealing with concept drift. IEEE Trans. Knowl. Data Eng. 24(4), 619–633 (2012)

  10. Mahdi, O.A., Pardede, E., Cao, J.: Combination of information entropy and ensemble classification for detecting concept drift in data stream. In: Proceedings of the Australasian Computer Science Week Multiconference, p. 13. ACM (2018)

  11. Liu, A., Zhang, G., Lu, J.: Fuzzy time windowing for gradual concept drift adaptation. In: IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), pp. 1–6. IEEE (2017)

  12. Li, P., Wu, X., Hu, X., Wang, H.: Learning concept-drifting data streams with random ensemble decision trees. Neurocomputing 166, 68–83 (2015)

  13. Elwell, R., Polikar, R.: Incremental learning of concept drift in nonstationary environments. IEEE Trans. Neural Netw. 22(10), 1517–1531 (2011)

  14. Alippi, C., Boracchi, G., Roveri, M.: Just-in-time classifiers for recurrent concepts. IEEE Trans. Neural Netw. Learn. Syst. 24(4), 620–634 (2013)

  15. Zliobaite, I., Bifet, A., Pfahringer, B., Holmes, G.: Active learning with drifting streaming data. IEEE Trans. Neural Netw. Learn. Syst. 25(1), 27–39 (2014)

  16. Zhang, T., Zhang, Q., Wang, Q.: Model detection for functional polynomial regression. Comput. Stat. Data Anal. 70, 183–197 (2014)

  17. Yun, U., Lee, G.: Sliding window based weighted erasable stream pattern mining for stream data applications. Futur. Gener. Comput. Syst. 59, 1–20 (2016)

  18. Yin, X., Huang, K., Hao, H.: DE2: dynamic ensemble of ensembles for learning nonstationary data. Neurocomputing 165, 14–22 (2015)

  19. Ye, Y., Squartini, S., Piazza, F.: Online sequential extreme learning machine in nonstationary environments. Neurocomputing 116, 94–101 (2013)

  20. Souto Maior Barros, R., Carvalho Santos, S.G.T.: A large-scale comparison of concept drift detectors. Inf. Sci. 451–452, 348–370 (2018)

  21. Escovedo, T., Koshiyama, A., Abs da Cruz, A., Vellasco, M.: DetectA: abrupt concept drift detection in non-stationary environments. Appl. Soft Comput. 62, 119–133 (2018)

  22. Webb, G.I., Kuan Lee, L., Petitjean, F., Goethals, B.: Understanding concept drift. CoRR (2017). arXiv:1704.00362

  23. Zambon, D., Alippi, C., Livi, L.: Concept drift and anomaly detection in graph streams. IEEE Trans. Neural Netw. Learn. Syst., 1–14 (2018)

  24. Gama, J., Žliobaitė, I., Bifet, A., Pechenizkiy, M., Bouchachia, A.: A survey on concept drift adaptation. ACM Comput. Surv. (CSUR) 46(4), 44:1–44:37 (2014)

  25. Ditzler, G., Roveri, M., Alippi, C., Polikar, R.: Learning in nonstationary environments: a survey. IEEE Comput. Intell. Mag. 10(4), 12–25 (2015)

  26. Aizerman, M., Braverman, E., Rozonoer, L.: Theoretical foundations of the potential function method in pattern recognition learning. Autom. Remote Control 25, 821–837 (1964)

  27. Révész, P.: How to apply the method of stochastic approximation in the nonparametric estimation of a regression function. Mathematische Operationsforschung und Statistik, Series Statistics 8, 119–126 (1977)

  28. Braverman, E., Rozonoer, L.: Convergence of random processes in machine learning theory. Autom. Remote Control 30, 44–64 (1969)

  29. Sorour, E.: On the convergence of the dynamic stochastic approximation method for stochastic non-linear multidimensional dynamic systems. Cybernetica 14, 28–37 (1978)

  30. Uosaki, K.: Application of stochastic approximation to the tracking of a stochastic non-linear dynamic system. Int. J. Control 18, 1233–1247 (1973)

  31. Uosaki, K.: Some generalizations of dynamic stochastic approximation process. Ann. Stat. 2, 1042–1048 (1974)

  32. Chung, K.: On a stochastic approximation method. Ann. Math. Stat. 25, 463–483 (1954)

  33. Watanabe, M.: On convergence of asymptotically optimal discriminant functions for pattern classification problem. Bull. Math. Statist. 16, 23–34 (1974)

  34. Tucker, H.: A Graduate Course in Probability. Academic, New York (1967)

  35. Efromovich, S.: Nonparametric Curve Estimation: Methods, Theory and Applications. Springer, New York (1999)

  36. Duda, P., Jaworski, M., Rutkowski, L.: On ensemble components selection in data streams scenario with reoccurring concept-drift. In: 2017 IEEE Symposium Series on Computational Intelligence (SSCI), pp. 1–7 (2017)

  37. Duda, P., Jaworski, M., Rutkowski, L.: Convergent time-varying regression models for data streams: tracking concept drift by the recursive Parzen-based generalized regression neural networks. Int. J. Neural Syst. 28(02), 1750048 (2018)

  38. Rutkowski, L.: Adaptive probabilistic neural-networks for pattern classification in time-varying environment. IEEE Trans. Neural Netw. 15, 811–827 (2004)

  39. Rutkowski, L.: Generalized regression neural networks in time-varying environment. IEEE Trans. Neural Netw. 15 (2004)

  40. Jaworski, M.: Regression function and noise variance tracking methods for data streams with concept drift. Int. J. Appl. Math. Comput. Sci. 28(3), 559–567 (2018)

  41. Jaworski, M., Duda, P., Rutkowski, L., Najgebauer, P., Pawlak, M.: Heuristic regression function estimation methods for data streams with concept drift. Lecture Notes in Computer Science 10246, 726–737 (2017)

  42. Pietruczuk, L., Rutkowski, L., Jaworski, M., Duda, P.: The Parzen kernel approach to learning in non-stationary environment. In: 2014 International Joint Conference on Neural Networks (IJCNN), pp. 3319–3323 (2014)

  43. Duda, P., Pietruczuk, L., Jaworski, M., Krzyzak, A.: On the Cesaro-means-based orthogonal series approach to learning time-varying regression functions. In: Lecture Notes in Artificial Intelligence, pp. 37–48. Springer, Berlin (2016)

  44. Duda, P., Jaworski, M., Rutkowski, L.: Knowledge discovery in data streams with the orthogonal series-based generalized regression neural networks. Inf. Sci. 460–461, 497–518 (2018)


Author information

Correspondence to Leszek Rutkowski.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Rutkowski, L., Jaworski, M., Duda, P. (2020). General Non-parametric Learning Procedure for Tracking Concept Drift. In: Stream Data Mining: Algorithms and Their Probabilistic Properties. Studies in Big Data, vol 56. Springer, Cham. https://doi.org/10.1007/978-3-030-13962-9_9

