Discrete Quasi-gradient Features Weighting Algorithm

  • Conference paper
Neural Networks and Soft Computing

Part of the book series: Advances in Soft Computing ((AINSC,volume 19))

Abstract

A new method of feature weighting, also useful for feature extraction, is described. It is efficient and yields accurate results. The weighting algorithm can be combined with any kind of learning algorithm; here it was used with a k-nearest neighbours model to estimate the best feature basis for a given distance measure. Results obtained with this algorithm clearly show its superior performance in several benchmark tests.
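The abstract describes combining a feature-weighting search with a k-nearest-neighbours classifier. The paper's exact discrete quasi-gradient update rule is not reproduced here; the following is only a hedged sketch of the general idea, assuming a greedy search that adjusts per-feature weights in discrete steps and keeps a change only when it improves leave-one-out k-NN accuracy (the function and parameter names are illustrative, not the author's):

```python
# Illustrative sketch (not the paper's algorithm): discrete, search-based
# feature weighting wrapped around a k-nearest-neighbours classifier.
import numpy as np

def knn_loo_accuracy(X, y, weights, k=3):
    """Leave-one-out accuracy of k-NN under a weighted Euclidean distance."""
    Xw = X * weights                      # scale each feature by its weight
    correct = 0
    for i in range(len(X)):
        d = np.sum((Xw - Xw[i]) ** 2, axis=1)
        d[i] = np.inf                     # exclude the query point itself
        nn = np.argsort(d)[:k]
        pred = np.bincount(y[nn]).argmax()  # majority vote among neighbours
        correct += int(pred == y[i])
    return correct / len(X)

def discrete_weighting(X, y, k=3, step=0.25, sweeps=5):
    """Greedy discrete search over per-feature weights (assumed scheme)."""
    w = np.ones(X.shape[1])
    best = knn_loo_accuracy(X, y, w, k)
    for _ in range(sweeps):
        for j in range(X.shape[1]):
            for delta in (+step, -step):
                trial = w.copy()
                trial[j] = max(0.0, trial[j] + delta)
                acc = knn_loo_accuracy(X, y, trial, k)
                if acc > best:            # keep a step only if it helps
                    w, best = trial, acc
    return w, best
```

Weights driven to zero effectively remove a feature, which is why such a weighting scheme can also serve as feature extraction, as the abstract notes.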




Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jankowski, N. (2003). Discrete Quasi-gradient Features Weighting Algorithm. In: Rutkowski, L., Kacprzyk, J. (eds) Neural Networks and Soft Computing. Advances in Soft Computing, vol 19. Physica, Heidelberg. https://doi.org/10.1007/978-3-7908-1902-1_26


  • Print ISBN: 978-3-7908-0005-0

  • Online ISBN: 978-3-7908-1902-1
