A Comparison between Neural Networks and Decision Trees

  • Carsten Jacobsen
  • Uwe Zscherpel
  • Petra Perner
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1715)


In this paper, we empirically compare the performance of neural networks and decision trees on a data set for the detection of defects in welding seams. The data set was created by image feature extraction procedures applied to X-ray images. We consider the data set highly complex, containing imprecise and uncertain data. We explain how the data set was created and what kinds of features were extracted from the images. Then we describe the kinds of neural networks and decision tree induction methods used for classification, and introduce a framework for distinguishing classification methods. We observed that the performance of neural networks is not significantly better than that of decision trees when only the overall error rate is considered. A more detailed analysis of the error rate is necessary in order to judge the performance of the learning and classification methods. However, the error rate cannot be the only criterion for comparing the different learning methods; the choice involves a more complex selection process with further criteria, which we describe in the paper.
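The abstract's central observation, that a low overall error rate can mask poor detection of the rare defect class, can be illustrated with a minimal sketch. The labels, class names, and error counts below are hypothetical and not taken from the paper's data set:

```python
from collections import Counter

def per_class_error(y_true, y_pred, classes):
    """Fraction of each true class that was misclassified."""
    errors, totals = Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        totals[t] += 1
        if t != p:
            errors[t] += 1
    return {c: errors[c] / totals[c] for c in classes if totals[c]}

# Hypothetical imbalanced weld-seam labels: 10 defects, 90 sound seams.
y_true = ["defect"] * 10 + ["no_defect"] * 90
# A classifier that misses 6 of 10 defects but only 2 of 90 sound seams.
y_pred = (["defect"] * 4 + ["no_defect"] * 6
          + ["no_defect"] * 88 + ["defect"] * 2)

overall = sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)
print(overall)  # 0.08 -- looks good overall...
print(per_class_error(y_true, y_pred, ["defect", "no_defect"]))
# ...yet 60% of the actual defects are missed.
```

Two classifiers with near-identical overall error rates can therefore differ sharply in how they distribute errors across classes, which is why the paper argues for criteria beyond a single error figure.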


Decision Tree · Radial Basis Function · Welding Seam · Radial Basis Function Network · Explanation Capability
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.





Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Carsten Jacobsen (1)
  • Uwe Zscherpel (1)
  • Petra Perner (2)

  1. Bundesanstalt für Materialforschung und -prüfung, Berlin
  2. Institute of Computer Vision and Applied Computer Sciences Leipzig, Leipzig, Germany
