
Universal Learning Machines

  • Conference paper
Neural Information Processing (ICONIP 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5864)


Abstract

All existing learning methods have a particular bias that makes them suitable only for specific kinds of problems. A Universal Learning Machine (ULM) should instead find the simplest data model for arbitrary data distributions. Several ways to create ULMs are outlined, and an algorithm based on the creation of new global and local features, combined with meta-learning, is introduced. This algorithm is able to find simple solutions that sophisticated algorithms overlook, learn complex Boolean functions and complicated probability distributions, and handle problems requiring multiresolution decision borders.
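The abstract only sketches the approach; the following is a minimal, hypothetical Python sketch of the general idea rather than the authors' implementation. It constructs new global features (here, random linear projections) and local features (here, distances to a few prototypes), then runs a simple meta-learning loop that searches candidate models from simplest to most complex and accepts the first one that is accurate enough. The dataset, candidate models, and accuracy threshold are all illustrative assumptions.

```python
# Hypothetical sketch: feature construction + a simple meta-learning loop.
# Assumes scikit-learn; names and choices are illustrative, not the paper's code.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import pairwise_distances
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)

# Global features: random linear projections of the whole input space.
W = rng.normal(size=(X.shape[1], 10))
global_feats = X @ W

# Local features: distances to a few randomly chosen training prototypes.
prototypes = X[rng.choice(len(X), size=5, replace=False)]
local_feats = pairwise_distances(X, prototypes)

# Extended representation = original + global + local features.
X_ext = np.hstack([X, global_feats, local_feats])

# Meta-learning as a search over models ordered from simplest to most complex:
# stop at the first (simplest) model whose cross-validated accuracy is sufficient.
candidates = [
    ("naive Bayes", GaussianNB()),
    ("shallow decision tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
    ("SVM with RBF kernel", SVC(kernel="rbf", C=1.0)),
]
target_accuracy = 0.95  # illustrative threshold
for name, model in candidates:
    score = cross_val_score(model, X_ext, y, cv=5).mean()
    print(f"{name}: CV accuracy = {score:.3f}")
    if score >= target_accuracy:
        print(f"Selected simplest sufficient model: {name}")
        break
```

The ordering of candidates encodes a preference for simple models, which is the spirit of the ULM idea described in the abstract; a full system would also generate and prune many more feature types during the search.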





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Duch, W., Maszczyk, T. (2009). Universal Learning Machines. In: Leung, C.S., Lee, M., Chan, J.H. (eds) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol 5864. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10684-2_23


  • DOI: https://doi.org/10.1007/978-3-642-10684-2_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-10682-8

  • Online ISBN: 978-3-642-10684-2

