Dynamic Ensemble of Ensembles in Nonstationary Environments

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8227)

Abstract

Classifier ensembles are an active topic in learning from non-stationary data. In particular, batch growing ensemble methods are one important direction for handling the concept drift inherent in non-stationary data. However, current batch growing ensemble methods combine only the individual component classifiers, each trained independently on one batch of non-stationary data; they simply discard the interim ensembles and hence may lose useful information captured by those fine-tuned interim ensembles. In contrast, we introduce a comprehensive hierarchical approach called Dynamic Ensemble of Ensembles (DE2). The new method combines classifiers as an ensemble of all the interim ensembles built dynamically from consecutive batches of non-stationary data. DE2 comprises two key stages: (1) component classifiers and interim ensembles are trained dynamically; (2) the final ensemble is then learned by exponentially-weighted averaging over the available experts, i.e., the interim ensembles. We employ sparsity learning to select component classifiers intelligently, and we incorporate the techniques of Dynamic Weighted Majority and Learn++.NSE to better integrate different classifiers dynamically. We perform experiments on data from a typical non-stationary environment, the Pascal Large Scale Learning Challenge 2008 Webspam data, and compare DE2 with other competitive ensemble methods. The experimental results confirm that our approach consistently delivers better performance and shows promising generalization ability for learning in non-stationary environments.
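The abstract describes the second stage only at a high level. As a rough illustration, here is a minimal Python sketch of exponentially-weighted averaging over a growing pool of interim-ensemble experts, in the spirit of weighted-majority methods [1, 5]. The class name, the eta parameter, the entry weight for new experts, and the batch-error loss are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class ExponentiallyWeightedEnsemble:
    """Hypothetical sketch: combine frozen interim ensembles by
    exponentially-weighted averaging (not the paper's exact rules)."""

    def __init__(self, eta=0.5):
        self.eta = eta          # assumed learning rate for the exponential update
        self.experts = []       # interim ensembles grown so far (frozen experts)
        self.weights = np.empty(0)

    def add_expert(self, expert):
        # A new interim ensemble enters with the current mean weight so that
        # late arrivals are not drowned out by long-standing experts.
        w = self.weights.mean() if self.experts else 1.0
        self.experts.append(expert)
        self.weights = np.append(self.weights, w)
        self.weights /= self.weights.sum()

    def predict(self, X):
        # Weighted vote over all interim ensembles; labels assumed in {-1, +1}.
        votes = np.array([e.predict(X) for e in self.experts])
        return np.sign(self.weights @ votes)

    def update_weights(self, X, y):
        # Exponentially down-weight each expert by its error rate on the
        # newest labeled batch, then renormalize.
        losses = np.array([np.mean(e.predict(X) != y) for e in self.experts])
        self.weights *= np.exp(-self.eta * losses)
        self.weights /= self.weights.sum()
```

On each incoming batch, one would first call update_weights with the newly labeled batch, then train a fresh interim ensemble on that batch and register it via add_expert, so the expert pool grows with the stream.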


References

  1. Kolter, J., Maloof, M.: Dynamic weighted majority: An ensemble method for drifting concepts. Journal of Machine Learning Research 8(2), 2755–2790 (2007)

  2. Schlimmer, J., Granger, R.: Beyond incremental processing: Tracking concept drift. In: Proceedings of National Conference on Artificial Intelligence, pp. 502–507 (1986)

  3. Widmer, G., Kubat, M.: Learning in the presence of concept drift and hidden contexts. Machine Learning 23(1), 69–101 (1996)

  4. Kuncheva, L.: Classifier ensembles for changing environments. In: Proceedings of International Workshop on Multiple Classifier Systems, pp. 1–15 (2004)

  5. Littlestone, N., Warmuth, M.: The weighted majority algorithm. Information and Computation 108(2), 212–261 (1994)

  6. Herbster, M., Warmuth, M.: Tracking the best expert. Machine Learning 32(2), 151–178 (1998)

  7. Bousquet, O., Warmuth, M.: Tracking a small set of experts by mixing past posteriors. Journal of Machine Learning Research 3(1), 363–396 (2002)

  8. Kolter, J., Maloof, M.: Using additive expert ensembles to cope with concept drift. In: Proceedings of International Conference on Machine Learning, pp. 449–456 (2005)

  9. Kolter, J., Maloof, M.: Dynamic weighted majority: An ensemble method for drifting concepts. In: Proceedings of IEEE International Conference on Data Mining, pp. 123–130 (2003)

  10. Street, W., Kim, Y.: A streaming ensemble algorithm (SEA) for large-scale classification. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 377–382 (2001)

  11. Fan, W.: Streamminer: A classifier ensemble-based engine to mine concept-drifting data streams. In: Proceedings of International Conference on Very Large Data Bases, pp. 1257–1260 (2004)

  12. Chen, S., He, H.: SERA: Selectively recursive approach towards nonstationary imbalanced stream data mining. In: International Joint Conference on Neural Networks, pp. 522–529 (2009)

  13. Chen, S., He, H.: Toward incremental learning of nonstationary imbalanced data stream: A multiple selectively recursive approach. Evolving Systems 2(1), 30–50 (2011)

  14. Elwell, R., Polikar, R.: Incremental learning of concept drift in nonstationary environments. IEEE Trans. Neural Networks 22(10), 1517–1531 (2011)

  15. Shalizi, C., Jacobs, A., Klinkner, K., Clauset, A.: Adapting to non-stationarity with growing expert ensembles. arXiv:1103.0949 (2011)

  16. Webb, S., Caverlee, J., Pu, C.: Introducing the Webb Spam Corpus: Using email spam to identify web spam automatically. In: Proceedings of the Third Conference on Email and Anti-Spam (2006)

  17. Chang, C.C., Lin, C.J.: LIBSVM: a library for support vector machines. ACM Trans. Intelligent Systems and Technology 2(3), 1–27 (2011)



Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yin, XC., Huang, K., Hao, HW. (2013). Dynamic Ensemble of Ensembles in Nonstationary Environments. In: Lee, M., Hirose, A., Hou, ZG., Kil, R.M. (eds) Neural Information Processing. ICONIP 2013. Lecture Notes in Computer Science, vol 8227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42042-9_10

  • DOI: https://doi.org/10.1007/978-3-642-42042-9_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-42041-2

  • Online ISBN: 978-3-642-42042-9

  • eBook Packages: Computer Science, Computer Science (R0)
