Incremental Learning From Symbolic Objects

  • Michelle Sebag
  • Edwin Diday
  • Marc Schoenauer
Conference paper
Part of the NATO ASI Series book series (volume 61)

Summary

This paper deals with discrimination on real-world data when both examples and partial rules are initially available. As the learning base is noisy and insufficiently representative, our approach builds successive approximations of a discriminant rule base, with the aim of predicting the conclusions of further examples.

A 2-step iterative process is presented (a minimal sketch in code follows the list):

  1. A sub-optimal generalization leads to an approximate rule base including redundancy (strongly overlapping rules) and errors.

  2. A change of representation called reduction transforms the refinement of the previous rule base into a new learning problem. Iterating the generalization then defines a second rule set, refining and correcting the previous one, and so on.
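To make the two steps concrete, here is a minimal Python sketch of the loop: a greedy, sub-optimal generalization that drops conditions from each example's description while no counter-example becomes covered, followed by a reduction that re-describes every example by the rules it fires, yielding the next learning problem. All names (Example, Rule, generalize, reduce_representation) and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the 2-step iterative process: generalize, then reduce, then repeat.
# Everything here is an assumption made for illustration only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Example:
    attributes: dict          # attribute name -> value
    label: str                # class to discriminate

@dataclass(frozen=True)
class Rule:
    constraints: frozenset    # set of (attribute, value) pairs
    conclusion: str

    def covers(self, ex: Example) -> bool:
        return all(ex.attributes.get(a) == v for a, v in self.constraints)

def generalize(examples):
    """Step 1: one (possibly redundant) rule per example, obtained by greedily
    dropping constraints as long as no example of another class is covered."""
    rules = []
    for seed in examples:
        constraints = set(seed.attributes.items())
        for c in list(constraints):
            trial = Rule(frozenset(constraints - {c}), seed.label)
            if not any(trial.covers(e) and e.label != seed.label for e in examples):
                constraints.discard(c)   # safe to generalize: drop the condition
        rules.append(Rule(frozenset(constraints), seed.label))
    return rules

def reduce_representation(examples, rules):
    """Step 2 (reduction): re-describe each example by the rules it satisfies,
    which defines a new learning problem for the next iteration."""
    return [Example({f"rule_{i}": r.covers(e) for i, r in enumerate(rules)}, e.label)
            for e in examples]

if __name__ == "__main__":
    data = [
        Example({"color": "red",  "size": "big"},   "pos"),
        Example({"color": "red",  "size": "small"}, "pos"),
        Example({"color": "blue", "size": "big"},   "neg"),
    ]
    base = data
    for step in range(2):                  # two passes of the generalize/reduce loop
        rules = generalize(base)
        print(f"iteration {step}: {len(rules)} rules")
        base = reduce_representation(base, rules)
```

In this sketch the redundancy of step 1 (one rule per example, with strong overlaps) is deliberate: it is exactly what the reduction of step 2 exploits, since the new Boolean attributes record which overlapping rules agree or disagree on each example.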
First results on a well-studied learning base are detailed.

Copyright information

© Springer-Verlag Berlin Heidelberg 1990

Authors and Affiliations

  • Michelle Sebag (1)
  • Edwin Diday (2)
  • Marc Schoenauer (3)
  1. Centre de Mathématiques Appliquées, France
  2. Université Paris-IX Dauphine and INRIA, Le Chesnay, France
  3. Centre de Mathématiques Appliquées, Ecole Polytechnique, Palaiseau, France