An Evolutionary Algorithm for Global Induction of Regression Trees

  • Marek Krȩtowski
  • Marcin Czajkowski
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6114)


In this paper, a new evolutionary algorithm for the induction of univariate regression trees is proposed. In contrast to typical top-down approaches, it searches globally for the best tree structure and for the tests in internal nodes. The initial population of trees is created with diverse top-down methods applied to randomly chosen sub-samples of the training data. Specialized genetic operators allow the algorithm to evolve regression trees efficiently. A complexity term introduced in the fitness function helps to mitigate over-fitting. The preliminary experimental validation is promising: the resulting trees can be significantly less complex while offering at least comparable performance to their classical top-down counterparts.
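The abstract outlines a global evolutionary search: a population of trees initialized on sub-samples, genetic operators that modify whole subtrees, and a fitness function combining prediction error with a complexity penalty. The following minimal Python sketch illustrates that general scheme only; the names (`random_tree`, `mutate`, `evolve`), the subtree-regrowing operator, and the penalty weight `alpha` are all hypothetical simplifications, not the paper's actual operators or fitness function.

```python
import copy
import random

class Node:
    """Univariate regression-tree node: either a test (feature, threshold) or a leaf."""
    def __init__(self, feature=None, threshold=None, left=None, right=None, value=None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.value = left, right, value

    def predict(self, x):
        if self.value is not None:
            return self.value
        branch = self.left if x[self.feature] <= self.threshold else self.right
        return branch.predict(x)

    def size(self):
        return 1 if self.value is not None else 1 + self.left.size() + self.right.size()

def random_tree(X, y, depth=2):
    """Grow a small randomized tree (a stand-in for the diverse top-down initializers)."""
    if depth == 0 or len(set(y)) <= 1:
        return Node(value=sum(y) / len(y))
    f = random.randrange(len(X[0]))
    t = random.choice(X)[f]
    left = [(x, v) for x, v in zip(X, y) if x[f] <= t]
    right = [(x, v) for x, v in zip(X, y) if x[f] > t]
    if not left or not right:
        return Node(value=sum(y) / len(y))
    return Node(feature=f, threshold=t,
                left=random_tree([x for x, _ in left], [v for _, v in left], depth - 1),
                right=random_tree([x for x, _ in right], [v for _, v in right], depth - 1))

def rmse(tree, X, y):
    return (sum((tree.predict(x) - v) ** 2 for x, v in zip(X, y)) / len(y)) ** 0.5

def fitness(tree, X, y, alpha=0.01):
    # Error plus a complexity term to discourage over-grown trees
    # (alpha is an illustrative knob, not a value from the paper).
    return rmse(tree, X, y) + alpha * tree.size()

def nodes_of(tree):
    out = [tree]
    if tree.value is None:
        out += nodes_of(tree.left) + nodes_of(tree.right)
    return out

def mutate(tree, X, y):
    """One illustrative operator: replace a random subtree with a freshly grown one."""
    child = copy.deepcopy(tree)
    target = random.choice(nodes_of(child))
    target.__dict__.update(random_tree(X, y, depth=random.randint(0, 2)).__dict__)
    return child

def evolve(X, y, pop_size=20, generations=30, seed=0):
    """Global search: keep the fittest half, refill with mutants of survivors."""
    random.seed(seed)
    pop = []
    for _ in range(pop_size):
        # Each initial tree is grown on a random sub-sample of the training data.
        idx = [random.randrange(len(X)) for _ in range(max(2, len(X) // 2))]
        pop.append(random_tree([X[i] for i in idx], [y[i] for i in idx], depth=2))
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, X, y))
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors), X, y)
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda t: fitness(t, X, y))
```

On a simple step-function dataset, `evolve` typically finds a tree that is both small and accurate, because the complexity term penalizes splits that do not pay for themselves in reduced error.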


Keywords: Root Mean Square Error · Evolutionary Algorithm · Regression Tree · Internal Node · Global Induction




References

  1. Blake, C., Keogh, E., Merz, C.: UCI repository of machine learning databases (1998)
  2. Breiman, L., Friedman, J., Olshen, R., Stone, C.: Classification and Regression Trees. Wadsworth Int. Group (1984)
  3. Dobra, A., Gehrke, J.: SECRET: A scalable linear regression tree algorithm. In: Proc. KDD 2002 (2002)
  4. Fayyad, U., Piatetsky-Shapiro, G., Smyth, P., Uthurusamy, R. (eds.): Advances in Knowledge Discovery and Data Mining. AAAI Press, Menlo Park (1996)
  5. Frank, E., et al.: Weka 3 - Data Mining with Open Source Machine Learning Software in Java. University of Waikato (2000)
  6. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, Heidelberg (2009)
  7. Koza, J.: Concept formation and decision tree induction using genetic programming paradigm. In: Schwefel, H.-P., Männer, R. (eds.) PPSN 1990. LNCS, vol. 496, pp. 124–128. Springer, Heidelberg (1991)
  8. Krȩtowski, M., Grześ, M.: Global learning of decision trees by an evolutionary algorithm. In: Information Processing and Security Systems, pp. 401–410. Springer, Heidelberg (2005)
  9. Krȩtowski, M., Grześ, M.: Evolutionary learning of linear trees with embedded feature selection. In: Rutkowski, L., Tadeusiewicz, R., Zadeh, L.A., Żurada, J.M. (eds.) ICAISC 2006. LNCS (LNAI), vol. 4029, pp. 400–409. Springer, Heidelberg (2006)
  10. Krȩtowski, M., Grześ, M.: Evolutionary induction of mixed decision trees. International Journal of Data Warehousing and Mining 3(4), 68–82 (2007)
  11. Malerba, D., Esposito, F., Ceci, M., Appice, A.: Top-down induction of model trees with regression and splitting nodes. IEEE Trans. on PAMI 26(5), 612–625 (2004)
  12. Michalewicz, Z.: Genetic Algorithms + Data Structures = Evolution Programs, 3rd edn. Springer, Heidelberg (1996)
  13. Quinlan, J.: Learning with continuous classes. In: Proc. AI 1992, pp. 343–348. World Scientific, Singapore (1992)
  14. Torgo, L.: Inductive learning of tree-based regression models. Ph.D. Thesis, University of Porto (1999)

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Marek Krȩtowski¹
  • Marcin Czajkowski¹
  1. Faculty of Computer Science, Bialystok University of Technology, Białystok, Poland
