Using lattice-based framework as a tool for feature extraction

  • Engelbert Mephu Nguifo
  • Patrick Njiwoua
Instance Based Learning
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1398)

Abstract

Feature transformation (FT) is one of the ways to preprocess data in order to improve classification efficiency. Different FT approaches have been studied intensively in recent years. These approaches aim to provide the learner with only those attributes that are relevant to the target concept. This paper presents a process that extracts a set of new numerical features from the original set of boolean features through an empirical mapping function. The mapping relies on an entropy function to learn knowledge over the Galois semi-lattice built from the initial set of objects. One advantage is that the effect of possibly irrelevant features is reduced. This process makes it possible to design an instance-based learning system, IGLUE, which uses the Mahalanobis measure. A comparison with other ML systems is carried out in terms of classification accuracy and running time on some real-world and artificial datasets.
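To make the role of the distance measure concrete, the following is a minimal Python sketch (not the IGLUE implementation, and independent of the lattice-based feature extraction itself) of nearest-neighbour classification with the Mahalanobis distance over numerical features, such as those one might obtain after a boolean-to-numerical transformation. All names and the toy data are illustrative assumptions.

import numpy as np

def mahalanobis(x, y, inv_cov):
    # Mahalanobis distance between two numerical feature vectors.
    d = x - y
    return float(np.sqrt(d @ inv_cov @ d))

def nn_classify(query, train_X, train_y, inv_cov):
    # Return the label of the training instance closest to `query`.
    dists = [mahalanobis(query, x, inv_cov) for x in train_X]
    return train_y[int(np.argmin(dists))]

# Toy usage with hypothetical transformed features.
train_X = np.array([[0.2, 1.1], [0.9, 0.3], [1.8, 1.7]])
train_y = np.array([0, 1, 1])
# Pseudo-inverse of the feature covariance, for robustness to singular cases.
inv_cov = np.linalg.pinv(np.cov(train_X, rowvar=False))
print(nn_classify(np.array([1.0, 0.4]), train_X, train_y, inv_cov))

Using the (pseudo-)inverse covariance rescales and decorrelates the features, which is one way a distance measure can dampen the influence of redundant or noisy attributes compared with a plain Euclidean distance.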

Keywords

Entropy Function · Irrelevant Attribute · Binary Feature · Artificial Dataset · Feature Extraction Algorithm

Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Engelbert Mephu Nguifo (1)
  • Patrick Njiwoua (1)
  1. C.R.I.L, I.U.T de Lens, Université d'Artois, Rue de l'Université SP-16, Lens Cedex, France
