
New Specifics for a Hierarchical Estimator Meta-algorithm

  • Conference paper
Artificial Intelligence and Soft Computing (ICAISC 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7268)


Abstract

Hierarchical Estimator is a meta-algorithm, introduced in [1], for learning a nonlinear relation between two vector variables from training data, which is one of the core tasks of machine learning, performed primarily for the purpose of prediction. It arranges many simple function approximators into a tree-like structure in order to achieve a low-error solution.

This paper presents a new set of specifics for that meta-algorithm: a training set division method and a competence function creation method. The included experimental results show an improvement over the methods described in [1]. A short recollection of the Hierarchical Estimator is also included.
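For readers who have not seen [1], the general shape of such a meta-algorithm can be sketched briefly: a tree whose nodes each hold a simple approximator, divide their training set among child nodes, and weight the children's predictions with competence functions. The Python sketch below only illustrates that general idea; its concrete choices, namely linear least-squares node approximators, a k-means-style training set division, Gaussian competence weights, and residual targets for the children, are assumptions made for this example and are not the specifics proposed in [1] or in this paper.

    # Illustrative sketch only: the split and competence choices below are
    # assumptions of this example, not the methods proposed in the paper.
    import numpy as np

    class HENode:
        def __init__(self, depth=0, max_depth=2, n_children=2, seed=0):
            self.depth = depth
            self.max_depth = max_depth
            self.n_children = n_children
            self.rng = np.random.default_rng(seed)
            self.children = []
            self.centres = None

        def fit(self, X, Y):
            # Simple approximator at this node: linear least squares with a bias term.
            Xb = np.hstack([X, np.ones((len(X), 1))])
            self.W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
            if self.depth >= self.max_depth or len(X) < 4 * self.n_children:
                return self
            # Training set division (illustrative): a few k-means-style iterations.
            self.centres = X[self.rng.choice(len(X), self.n_children, replace=False)]
            for _ in range(10):
                d2 = ((X[:, None, :] - self.centres[None, :, :]) ** 2).sum(-1)
                labels = d2.argmin(axis=1)
                for k in range(self.n_children):
                    if np.any(labels == k):
                        self.centres[k] = X[labels == k].mean(axis=0)
            if np.bincount(labels, minlength=self.n_children).min() < 2:
                self.centres = None  # degenerate split: stay a leaf
                return self
            # Each child approximates the residual of this node's own output,
            # trained only on its share of the divided training set.
            residual = Y - Xb @ self.W
            for k in range(self.n_children):
                child = HENode(self.depth + 1, self.max_depth, self.n_children)
                child.fit(X[labels == k], residual[labels == k])
                self.children.append(child)
            return self

        def _competence(self, X):
            # Competence function (illustrative): normalised Gaussian weights based
            # on the distance to each child's data centre.
            d2 = ((X[:, None, :] - self.centres[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2)
            return w / w.sum(axis=1, keepdims=True)

        def predict(self, X):
            Xb = np.hstack([X, np.ones((len(X), 1))])
            out = Xb @ self.W
            if self.children:
                comp = self._competence(X)
                for k, child in enumerate(self.children):
                    out += comp[:, k:k + 1] * child.predict(X)
            return out

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.uniform(-2.0, 2.0, size=(500, 2))
        Y = np.sin(2.0 * X[:, :1]) + X[:, 1:] ** 2 + 0.05 * rng.normal(size=(500, 1))
        model = HENode().fit(X, Y)
        print("training MSE:", float(((model.predict(X) - Y) ** 2).mean()))

Running the script fits the toy model on synthetic data and prints its training error; the paper itself evaluates its proposed training set division and competence function creation method experimentally against those of [1].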


References

  1. Brodowski, S., Podolak, I.T.: Hierarchical estimator. Expert Systems with Applications 38(10), 12237–12248 (2011)

  2. Brodowski, S.: On mean squared error of hierarchical estimator. Schedae Informaticae 20 (2011) (accepted)

  3. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning, 2nd edn. Springer, New York (2009)

  4. Haykin, S.: Neural Networks: A Comprehensive Foundation, 3rd edn. Prentice Hall (2009)

  5. Russell, S.J., Norvig, P.: Artificial Intelligence: A Modern Approach (2002)

  6. Hand, D., Mannila, H., Smyth, P.: Principles of Data Mining. MIT Press (2001)

  7. Riedmiller, M., Braun, H.: Rprop - a fast adaptive learning algorithm. In: Proceedings of the International Symposium on Computer and Information Science VII (1992)

  8. Bezdek, J.C., Keller, J.M., Krishnapuram, R., Pal, N.R.: Fuzzy Models and Algorithms for Pattern Recognition and Image Processing. Springer (1999)

  9. Quinlan, J.R.: Learning with continuous classes. In: Proceedings of the 5th Australian Conference on Artificial Intelligence, AI 1992, pp. 343–348. World Scientific (1992)

  10. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55, 119–139 (1997)

  11. Bartlett, P.L., Traskin, M.: AdaBoost is consistent. Journal of Machine Learning Research 8, 2347–2368 (2007)

  12. Schapire, R.E.: The strength of weak learnability. Machine Learning 5(2), 197–227 (1990)

  13. Tresp, V.: Committee machines. In: Hu, Y.H., Hwang, J.N. (eds.) Handbook for Neural Network Signal Processing. CRC Press (2001)

  14. Jordan, M.I., Jacobs, R.A.: Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 181–214 (1994)

  15. Saito, K., Nakano, R.: A constructive learning algorithm for an HME. In: IEEE International Conference on Neural Networks, vol. 3, pp. 1268–1273 (1996)

  16. Podolak, I.T.: Hierarchical classifier with overlapping class groups. Expert Systems with Applications 34(1), 673–682 (2008)

  17. Pal, N.R., Bezdek, J.C.: On cluster validity for the fuzzy c-means model. IEEE Transactions on Fuzzy Systems 3(3), 370–379 (1995)

  18. Brodowski, S.: A validity criterion for fuzzy clustering. In: Jędrzejowicz, P., Nguyen, N.T., Hoang, K. (eds.) ICCCI 2011, Part I. LNCS (LNAI), vol. 6922, pp. 113–122. Springer, Heidelberg (2011)

  19. Wann, C.-D., Thomopoulos, S.C.A.: A comparative study of self-organizing clustering algorithms DIGNET and ART2. Neural Networks 10(4), 737–753 (1997)

  20. Brodowski, S.: Adaptujący się hierarchiczny aproksymator (An adaptive hierarchical approximator). MSc thesis (2007) (in Polish)

  21. Asuncion, A., Newman, D.: UCI Machine Learning Repository (2007)


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Brodowski, S., Bielecki, A. (2012). New Specifics for a Hierarchical Estimator Meta-algorithm. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2012. Lecture Notes in Computer Science (LNAI), vol. 7268. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29350-4_3


  • DOI: https://doi.org/10.1007/978-3-642-29350-4_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-29349-8

  • Online ISBN: 978-3-642-29350-4
