Automatic Construction of Decision Trees and Neural Nets for Classification Using Statistical Considerations

  • F. Wysotzki
  • W. Müller
  • B. Schulmeister
Conference paper
Part of the International Centre for Mechanical Sciences book series (CISM, volume 382)


Two algorithms for the supervised learning of classifications are discussed from the point of view of the usefulness of including statistical methods. It is demonstrated that statistical considerations of a very general nature (i.e. without assumptions about the class distributions) can lead to substantial improvements in both the learning procedure and the constructed classifiers. The decision tree learner CAL5 converts real-valued attributes into discrete-valued ones, where the number of discrete values is not restricted to two; pruning occurs during tree construction. The hybrid (statistical/neural) algorithm DIPOL solves the problem of choosing the initial architecture and the initial weights by statistical methods, and replaces additional hidden layers by a Boolean decision function. Both algorithms are also discussed within the framework of the ESPRIT project StatLog, in which about 20 of the most important procedures for classification learning are compared using statistical criteria.
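The statistical-initialization idea behind DIPOL can be illustrated with a minimal sketch. The details below are illustrative assumptions, not the paper's actual procedure: one initial hyperplane per pair of classes, placed as the perpendicular bisector of the two class means, and a hand-coded sign-matching rule standing in for the learned Boolean decision function.

```python
# Hedged sketch of a DIPOL-style piecewise linear classifier.
# All placement rules and function names here are illustrative
# assumptions; the paper's own initialization and Boolean
# decision function differ in detail.
import numpy as np

def class_means(X, y):
    labels = sorted(set(y))
    return labels, {c: X[np.array(y) == c].mean(axis=0) for c in labels}

def init_hyperplanes(means, labels):
    """One (w, b) per class pair: the perpendicular bisector of the means."""
    planes = []
    for i in range(len(labels)):
        for j in range(i + 1, len(labels)):
            m_i, m_j = means[labels[i]], means[labels[j]]
            w = m_i - m_j                  # normal points toward class i
            b = -w @ (m_i + m_j) / 2.0     # plane passes through the midpoint
            planes.append((w, b))
    return planes

def sign_vector(x, planes):
    """Which side of each hyperplane the point x lies on."""
    return tuple(1 if w @ x + b >= 0 else 0 for w, b in planes)

def predict(x, planes, means, labels):
    """Boolean decision stand-in: match the region (sign vector) of x
    against the regions the class means fall into (Hamming-nearest)."""
    sx = sign_vector(x, planes)
    return min(labels, key=lambda c: sum(
        a != b for a, b in zip(sx, sign_vector(means[c], planes))))

# Toy data: three well-separated 2-D classes.
X = np.array([[0., 0.], [0.2, 0.1], [5., 0.], [5.1, 0.2], [2.5, 5.], [2.4, 4.8]])
y = ['a', 'a', 'b', 'b', 'c', 'c']
labels, means = class_means(X, y)
planes = init_hyperplanes(means, labels)
print(predict(np.array([0.1, 0.0]), planes, means, labels))  # → 'a'
```

In the full algorithm these statistically placed hyperplanes serve only as the starting point; their weights are then refined by gradient descent, and the Boolean function combining the half-space decisions is learned rather than fixed.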


Keywords: Decision Tree · Boolean Function · Supervised Learning · Terminal Node · Stochastic Approximation





Copyright information

© Springer-Verlag Wien 1997

Authors and Affiliations

  • F. Wysotzki (1)
  • W. Müller (2)
  • B. Schulmeister (2)
  1. Technical University of Berlin, Berlin, Germany
  2. Fraunhofer-Gesellschaft, Berlin, Germany
