Abstract
Prototype selection, as a preprocessing step in machine learning, is effective in decreasing the computational cost of classification tasks by reducing the number of retained instances. This goal is achieved by reducing noise and rejecting irrelevant data. Prototypes may also be used to understand the data by improving the comprehensibility of results. In this paper we discuss an approach to instance selection based on techniques known from feature selection, pointing out the dualism between feature and instance selection. Finally, experiments are presented that use feature ranking methods for instance selection.
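The feature/instance dualism mentioned above can be sketched concretely: each training instance induces a column of a similarity (kernel) matrix, and that column can be scored with an ordinary feature-ranking criterion. The following is a minimal illustrative sketch, not the authors' algorithm; the Gaussian similarity, the Pearson-correlation criterion, and the `rank_instances` helper are all assumptions chosen for brevity.

```python
import numpy as np

def rank_instances(X, y, gamma=1.0):
    """Rank training instances as prototype candidates.

    Illustrative sketch of the feature/instance duality: each column of a
    Gaussian similarity matrix K (one column per training instance) is
    treated as a 'feature', and a simple feature-ranking criterion
    (absolute Pearson correlation with the class labels) scores how useful
    that instance is for discriminating the classes.
    """
    # Pairwise squared Euclidean distances between all instances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)            # similarity matrix, shape (n, n)
    yc = y - y.mean()                  # centred labels (binary 0/1 assumed)
    scores = np.empty(len(X))
    for j in range(len(X)):
        col = K[:, j] - K[:, j].mean()
        denom = np.linalg.norm(col) * np.linalg.norm(yc)
        scores[j] = abs(col @ yc) / denom if denom > 0 else 0.0
    return np.argsort(scores)[::-1]    # instance indices, best-ranked first

# Toy usage: two well-separated clusters, keep the 4 best-ranked instances
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (10, 2)), rng.normal(3, 0.3, (10, 2))])
y = np.array([0] * 10 + [1] * 10)
prototypes = rank_instances(X, y)[:4]
```

Any filter-style ranking criterion (e.g. mutual information instead of correlation) could be substituted in the same scheme; the point is only that the instance axis is treated the way feature selection treats the feature axis.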
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Blachnik, M., Duch, W., Maszczyk, T. (2012). Feature Ranking Methods Used for Selection of Prototypes. In: Villa, A.E.P., Duch, W., Érdi, P., Masulli, F., Palm, G. (eds) Artificial Neural Networks and Machine Learning – ICANN 2012. Lecture Notes in Computer Science, vol 7553. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33266-1_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-33265-4
Online ISBN: 978-3-642-33266-1