
GRASP Forest: A New Ensemble Method for Trees

  • Conference paper
Multiple Classifier Systems (MCS 2011)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 6713)


Abstract

This paper proposes GRASP Forest, a method for constructing ensembles of decision trees. The method uses GRASP, a metaheuristic usually applied to optimization problems, to increase the diversity of the ensemble. While Random Forest increases diversity by randomly choosing a subset of attributes at each tree node, GRASP Forest takes all the attributes into account; the randomness in the method comes instead from the GRASP metaheuristic. Rather than choosing the best attribute from a randomly selected subset of attributes, as Random Forest does, the attribute is chosen at random from a subset of good candidate attributes. Besides the selection of attributes, GRASP is also used to select the split value for each numeric attribute. The method is compared with Bagging, Random Forest, Random Subspaces, AdaBoost, and MultiBoost, and the results for the proposed method are very competitive.

This work was supported by the Project TIN2008-03151 of the Spanish Ministry of Education and Science.
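
The full algorithm appears in the body of the chapter, which is not reproduced here; the following is only a minimal, illustrative sketch of the kind of greedy randomized node split the abstract describes. The scoring by information gain, the alpha parameter controlling the size of the restricted candidate list, and the names grasp_choice and grasp_split are assumptions made for the example, not the authors' implementation.

```python
import math
import random

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(X, y, attr, threshold):
    """Gain of splitting the numeric attribute `attr` at `threshold`."""
    left = [yi for xi, yi in zip(X, y) if xi[attr] <= threshold]
    right = [yi for xi, yi in zip(X, y) if xi[attr] > threshold]
    if not left or not right:
        return 0.0
    n = len(y)
    return (entropy(y)
            - len(left) / n * entropy(left)
            - len(right) / n * entropy(right))

def grasp_choice(scored, alpha, rng):
    """GRASP-style greedy randomized choice: keep every option whose score
    is within an `alpha` fraction of the best (the restricted candidate
    list, RCL) and pick one of them uniformly at random."""
    best = max(score for _, score in scored)
    worst = min(score for _, score in scored)
    cutoff = best - alpha * (best - worst)
    rcl = [option for option, score in scored if score >= cutoff]
    return rng.choice(rcl)

def grasp_split(X, y, alpha=0.3, rng=random):
    """Pick an (attribute, threshold) split for one tree node.
    All attributes are considered; randomness comes from the GRASP choice
    over good candidates, not from sampling a subset of attributes."""
    candidates = []
    for attr in range(len(X[0])):
        values = sorted({xi[attr] for xi in X})
        # Candidate thresholds: midpoints between consecutive observed values.
        thresholds = [(a + b) / 2 for a, b in zip(values, values[1:])]
        if not thresholds:
            continue
        # GRASP choice of the split value for this numeric attribute.
        scored_thresholds = [(t, information_gain(X, y, attr, t)) for t in thresholds]
        threshold = grasp_choice(scored_thresholds, alpha, rng)
        candidates.append(((attr, threshold),
                           information_gain(X, y, attr, threshold)))
    # GRASP choice among the per-attribute candidate splits.
    return grasp_choice(candidates, alpha, rng)

# Toy usage: four instances, two numeric attributes, two classes.
X = [[2.0, 7.1], [3.5, 6.0], [8.0, 1.2], [9.1, 0.5]]
y = ["a", "a", "b", "b"]
print(grasp_split(X, y, alpha=0.3, rng=random.Random(0)))
```

In a full GRASP Forest, each tree of the ensemble would be grown by applying such a randomized split recursively, so diversity comes from the randomized choice among good candidate splits rather than from restricting the attributes considered at each node, as Random Forest does.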

References

  1. Kuncheva, L.I.: Combining Pattern Classifiers: Methods and Algorithms. Wiley Interscience, Hoboken (2004)

  2. Breiman, L.: Bagging Predictors. Machine Learning 24, 123–140 (1996)

  3. Ho, T.K.: The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20, 832–844 (1998)

  4. Freund, Y., Schapire, R.: Experiments with a new boosting algorithm. In: Proceedings of the Thirteenth International Conference on Machine Learning (ICML), pp. 148–156 (1996)

  5. Webb, G.I.: MultiBoosting: A Technique for Combining Boosting and Wagging. Machine Learning 40, 159–196 (2000)

  6. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)

  7. Dietterich, T.G.: An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning 40, 139–157 (2000), doi:10.1023/A:1007607513941

  8. Maudes, J., Rodríguez, J.J., García-Osorio, C., García-Pedrajas, N.: Random feature weights for decision tree ensemble construction. Information Fusion (2010)

  9. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)

  10. Breiman, L., Friedman, J., Stone, C.J., Olshen, R.A.: Classification and Regression Trees. Chapman and Hall, Boca Raton (1984)

  11. Feo, T., Resende, M.: A probabilistic heuristic for a computationally difficult set covering problem. Operations Research Letters 8, 67–71 (1989)

  12. Feo, T., Resende, M.: Greedy randomized adaptive search procedures. Journal of Global Optimization 6, 109–133 (1995)

  13. Pacheco, J., Alfaro, E., Casado, S., Gámez, M., García, N.: Uso del metaheurístico GRASP en la construcción de árboles de clasificación [Use of the GRASP metaheuristic in the construction of classification trees]. Rect@ 11, 139–154 (2010)

  14. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)

  15. Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA data mining software: an update. SIGKDD Explor. Newsl. 11, 10–18 (2009)

  16. Nadeau, C., Bengio, Y.: Inference for the generalization error. Machine Learning 52, 239–281 (2003), doi:10.1023/A:1024068626366

  17. Dietterich, T.: Approximate statistical tests for comparing supervised classification learning algorithms. Neural Computation 10, 1895–1923 (1998)

  18. Frank, A., Asuncion, A.: UCI machine learning repository (2010)

  19. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. The Journal of Machine Learning Research 7, 1–30 (2006)

  20. Castiello, C., Castellano, G., Fanelli, A.: Meta-data: Characterization of input features for meta-learning. In: Modeling Decisions for Artificial Intelligence, pp. 457–468 (2005)

  21. Ho, T., Basu, M.: Complexity measures of supervised classification problems. IEEE Trans. Pattern Anal. Mach. Intell. 24, 289–300 (2002), doi:10.1109/34.990132

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Diez-Pastor, J.F., García-Osorio, C., Rodríguez, J.J., Bustillo, A. (2011). GRASP Forest: A New Ensemble Method for Trees. In: Sansone, C., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2011. Lecture Notes in Computer Science, vol 6713. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21557-5_9

  • DOI: https://doi.org/10.1007/978-3-642-21557-5_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21556-8

  • Online ISBN: 978-3-642-21557-5
