Dynamic Programming for Bayesian Logistic Regression Learning under Concept Drift

  • Pavel Turkov
  • Olga Krasotkina
  • Vadim Mottl
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8251)

Abstract

A data stream is an ordered sequence of training instances arriving at a rate that does not permit storing them permanently in memory, which makes online learning methods necessary when trying to predict some hidden target variable. In addition, concept drift often occurs, meaning that the statistical properties of the target variable may change over time. In this paper, we present a framework for solving the online pattern recognition problem in data streams under concept drift. The framework is based on applying the Bayesian approach to a probabilistic pattern recognition model formulated in terms of logistic regression, a hidden Markov model and dynamic programming.
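The core idea of the abstract can be illustrated with a minimal sketch: the classifier weights are treated as a hidden state that drifts over time under a Gaussian random-walk (linear hidden Markov) model, and each arriving instance triggers a recursive predict/update cycle using a second-order (Laplace-style) approximation of the logistic likelihood. This is an illustrative approximation in the spirit of the paper's framework, not the authors' exact algorithm; the class name `DriftingLogisticFilter` and its parameters are assumptions introduced here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class DriftingLogisticFilter:
    """Online Bayesian logistic regression under concept drift.

    The weight vector is modeled as a hidden Markov state with a
    Gaussian random-walk transition; each observation is absorbed with
    a Laplace (second-order) approximation of the logistic likelihood.
    A sketch of the general technique, not the paper's exact method.
    """

    def __init__(self, dim, drift_var=1e-3, prior_var=10.0):
        self.w = np.zeros(dim)            # posterior mean of the weights
        self.P = prior_var * np.eye(dim)  # posterior covariance
        self.Q = drift_var * np.eye(dim)  # drift (state-noise) covariance

    def predict_proba(self, x):
        """Probability that instance x belongs to class 1."""
        return sigmoid(self.w @ x)

    def update(self, x, y):
        """Consume one labeled instance (x, y) with y in {0, 1}."""
        # Predict step: the concept may have drifted, so uncertainty grows.
        self.P = self.P + self.Q
        # Update step: rank-one precision increase from the logistic
        # likelihood, applied via the Sherman-Morrison identity.
        p = sigmoid(self.w @ x)
        r = p * (1.0 - p)                 # curvature of the log-likelihood
        Px = self.P @ x
        denom = 1.0 + r * (x @ Px)
        self.P = self.P - np.outer(Px, Px) * (r / denom)
        # Gradient step on the mean, preconditioned by the new covariance.
        self.w = self.w + (self.P @ x) * (y - p)
```

A larger `drift_var` makes the filter forget old data faster, which helps after abrupt drift but increases variance on stationary segments; this is the same trade-off a sliding-window learner makes through its window size.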

Keywords

online learning, concept drift, logistic regression, hidden Markov model


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Pavel Turkov (1)
  • Olga Krasotkina (1)
  • Vadim Mottl (2)
  1. Tula State University, Tula, Russia
  2. Computing Center of the Russian Academy of Sciences, Moscow, Russia
