Discrete Quasi-gradient Features Weighting Algorithm

  • Norbert Jankowski
Conference paper
Part of the Advances in Soft Computing book series (AINSC, volume 19)


A new method of feature weighting, also useful as a feature extraction tool, is described. It is efficient and yields accurate results, and it can be combined with any learning algorithm. Here the weighting algorithm is used with a k-nearest neighbours model to estimate the best feature basis for a given distance measure. Results obtained on several benchmark tests clearly show its superior performance.
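
The abstract does not spell out the search procedure itself, so the sketch below is only illustrative: a discrete, coordinate-wise ("quasi-gradient") search over feature weights, scored by the leave-one-out accuracy of a weighted k-nearest-neighbours classifier. The function names (knn_accuracy, discrete_weight_search), the step schedule, and the synthetic data are assumptions made for this example, not the author's exact algorithm.

    # Illustrative sketch only: a discrete, coordinate-wise weight search wrapped
    # around weighted kNN. Names, steps, and data are assumptions, not the paper's
    # exact procedure.
    import numpy as np

    def knn_accuracy(X, y, weights, k=3):
        """Leave-one-out accuracy of kNN under a weighted Euclidean distance."""
        Xw = X * weights                      # scale each feature by its weight
        correct = 0
        for i in range(len(X)):
            d = np.linalg.norm(Xw - Xw[i], axis=1)
            d[i] = np.inf                     # exclude the query point itself
            nn = np.argsort(d)[:k]
            pred = np.bincount(y[nn]).argmax()
            correct += int(pred == y[i])
        return correct / len(X)

    def discrete_weight_search(X, y, steps=(0.5, 0.25, 0.1), k=3):
        """Greedy coordinate-wise search: perturb one weight at a time by a
        discrete step and keep the change only if accuracy improves."""
        w = np.ones(X.shape[1])
        best = knn_accuracy(X, y, w, k)
        for step in steps:                    # progressively finer discrete steps
            improved = True
            while improved:
                improved = False
                for j in range(X.shape[1]):
                    for delta in (+step, -step):
                        cand = w.copy()
                        cand[j] = max(0.0, cand[j] + delta)
                        acc = knn_accuracy(X, y, cand, k)
                        if acc > best:
                            w, best = cand, acc
                            improved = True
        return w, best

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(150, 4))
        # synthetic toy data: only feature 0 carries the class information
        y = (X[:, 0] + 0.1 * rng.normal(size=150) > 0).astype(int)
        w, acc = discrete_weight_search(X, y)
        print("weights:", np.round(w, 2), "LOO accuracy:", acc)

In a scheme of this kind, weights driven towards zero effectively remove a feature, which is one way such a weighting procedure can also serve as a feature extraction or selection tool, as the abstract suggests.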


Keywords: Feature Weighting, Weighting Algorithm, Glass Data, Australian Credit, Feature Extraction Tool





Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Norbert Jankowski
  1. Department of Informatics, Nicholas Copernicus University, Toruń, Poland
