
Prototype Selection with Compact Sets and Extended Rough Sets

  • Yenny Villuendas-Rey
  • Yailé Caballero-Mota
  • María Matilde García-Lorenzo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7637)

Abstract

In this paper, we propose the Nearest Neighborhood Rough Sets, a generalization of classical Rough Sets that modifies the indiscernibility relation without using any similarity threshold. We combine these Rough Sets with Compact Sets to obtain a prototype selection algorithm for Nearest Prototype Classification that handles mixed and incomplete data as well as arbitrary dissimilarity functions. We also introduce a set of rules to predict a priori the performance of the proposed prototype selection algorithm. Numerical experiments over repository databases show the high quality of the proposed method in terms of classifier accuracy and object reduction.
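As a reading aid, a minimal sketch follows of one way a nearest-neighbourhood indiscernibility relation and its rough lower approximation could be computed from a precomputed dissimilarity matrix; the function names, the selection rule, and the toy data are illustrative assumptions, not the authors' exact algorithm.

    # Hedged sketch: nearest-neighbour granules and a rough lower approximation.
    # Illustrates the general idea only; this is not the algorithm from the paper.
    import numpy as np

    def nn_granule(D, i):
        """Indices whose dissimilarity to object i is minimal (i itself excluded)."""
        d = D[i].copy()
        d[i] = np.inf
        return np.flatnonzero(d == d.min())

    def lower_approximation(D, y):
        """Keep objects whose nearest-neighbour granule lies entirely in their own class."""
        return [i for i in range(len(y))
                if all(y[j] == y[i] for j in nn_granule(D, i))]

    # Toy usage with a hypothetical dissimilarity matrix and class labels.
    D = np.array([[0., 1., 5., 6.],
                  [1., 0., 5., 6.],
                  [5., 5., 0., 1.],
                  [6., 6., 1., 0.]])
    y = np.array([0, 0, 1, 1])
    print(lower_approximation(D, y))  # -> [0, 1, 2, 3]: every object's nearest neighbour shares its class

Note that such a granule needs no similarity threshold: it depends only on which dissimilarities are minimal, so any dissimilarity function defined over mixed or incomplete data can be plugged in.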

Keywords

prototype selection · compact sets · rough sets



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Yenny Villuendas-Rey (1, 3)
  • Yailé Caballero-Mota (2)
  • María Matilde García-Lorenzo (3)
  1. Computer Science Department, University of Ciego de Ávila, Cuba
  2. Computer Science Department, University of Camagüey, Cuba
  3. Computer Science Department, University of Las Villas, Cuba
