Meta-learning via Search Combined with Parameter Optimization

  • Włodzisław Duch
  • Karol Grudziński
Part of the Advances in Soft Computing book series (AINSC, volume 17)


The framework of Similarity-Based Methods (SBMs) makes it possible to create many algorithms that differ in important aspects. Although no single learning algorithm can outperform all others on every dataset, a nearly optimal algorithm may be found within the SBM framework. To avoid tedious experimentation, a meta-learning search procedure in the space of all possible algorithms is used to build new algorithms. Each new algorithm is generated by applying admissible extensions to the existing algorithms, and the most promising ones are retained and extended further. Training is performed using parameter optimization techniques. Preliminary tests of this approach are very encouraging.
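The search procedure outlined above can be read as a beam search over algorithm configurations, with each candidate tuned by parameter optimization before scoring. The following is a minimal sketch under that reading; the extension names, the toy `evaluate` function, and `optimize_parameters` are illustrative assumptions, not the authors' implementation (which optimizes real SBM models, e.g. by simulated annealing and cross-validation).

```python
# Hedged sketch of meta-learning as beam search over algorithm extensions.
# A candidate algorithm is represented as a frozenset of extensions; its
# numeric parameters are tuned before it is scored. All concrete names and
# scores here are stand-ins for illustration only.
import random

EXTENSIONS = ["feature_selection", "manhattan_distance", "reference_vectors"]

def evaluate(config, k):
    """Toy validation score; a real system would run cross-validation
    of the similarity-based model described by `config` with parameter k."""
    random.seed(hash((config, k)) % (2**32))  # deterministic toy score
    return random.random()

def optimize_parameters(config):
    """Stand-in for parameter optimization (e.g. simulated annealing):
    here we simply pick the best odd k for a k-NN-like model."""
    return max(range(1, 10, 2), key=lambda k: evaluate(config, k))

def meta_learning_search(beam_width=2, rounds=2):
    beam = [frozenset()]  # start from the base algorithm (no extensions)
    for _ in range(rounds):
        candidates = set(beam)
        for config in beam:
            for ext in EXTENSIONS:  # apply each admissible extension
                if ext not in config:
                    candidates.add(config | {ext})
        # Tune parameters of every candidate, keep the most promising ones.
        scored = [(evaluate(c, optimize_parameters(c)), c) for c in candidates]
        scored.sort(key=lambda t: t[0], reverse=True)
        beam = [c for _, c in scored[:beam_width]]
    return beam[0]

best = meta_learning_search()
print(sorted(best))
```

The beam keeps the search tractable: only the most promising partial algorithms are extended further, which mirrors the paper's strategy of retaining and extending the best candidates rather than enumerating the whole space.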


Keywords: Feature Selection, Reference Model, Reference Vector, Manhattan Distance, Euclidean Distance Function





Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Włodzisław Duch (1)
  • Karol Grudziński (1)

  1. Department of Informatics, Nicholas Copernicus University, Toruń, Poland
