Discrete Quasi-gradient Features Weighting Algorithm
A new method of feature weighting, also useful for feature extraction, is described. It is efficient and yields accurate results, and the weighting algorithm may be combined with any kind of learning algorithm. Here, the weighting algorithm was used with a k-nearest neighbors model to estimate the best set of feature weights for a given distance measure. Results obtained with this algorithm show its superior performance on several benchmark tests.
Keywords: Feature Weighting, Weighting Algorithm, Glass Data, Australian Credit, Feature Extraction Tool
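The abstract describes weighting features by searching over discrete weight values and scoring each candidate with a k-nearest neighbors model. The sketch below illustrates that general scheme, not the authors' exact procedure: the step schedule, the greedy coordinate search, and the use of leave-one-out 1-NN accuracy as the score are illustrative assumptions.

```python
# Hedged sketch of discrete feature weighting scored by a k-NN model.
# The step sizes and greedy search order are illustrative assumptions.
import numpy as np

def loo_knn_accuracy(X, y, weights, k=1):
    """Leave-one-out k-NN accuracy under a weighted Euclidean metric."""
    Xw = X * weights
    correct = 0
    for i in range(len(X)):
        d = np.linalg.norm(Xw - Xw[i], axis=1)
        d[i] = np.inf                      # exclude the query point itself
        nn = np.argsort(d)[:k]
        pred = np.bincount(y[nn]).argmax() # majority vote of neighbors
        correct += pred == y[i]
    return correct / len(X)

def weight_features(X, y, steps=(0.5, 0.25), k=1):
    """Greedy coordinate search over discrete weight changes: perturb one
    weight at a time and keep the change only if the score improves."""
    w = np.ones(X.shape[1])
    best = loo_knn_accuracy(X, y, w, k)
    for step in steps:                     # coarse-to-fine step schedule
        for j in range(X.shape[1]):
            for delta in (+step, -step):
                trial = w.copy()
                trial[j] = max(0.0, trial[j] + delta)
                acc = loo_knn_accuracy(X, y, trial, k)
                if acc > best:
                    w, best = trial, acc
    return w, best
```

A weight driven toward zero effectively removes the corresponding feature, which is why such a weighting scheme doubles as a feature extraction tool.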