Proactive Forest for Supervised Classification

  • Conference paper
Progress in Artificial Intelligence and Pattern Recognition (IWAIPR 2018)

Abstract

Random Forest is one of the most widely used and accurate ensemble methods based on decision trees. Since diversity is a necessary condition for building a good ensemble, Random Forest selects a random feature subset when building each decision node. Even so, this procedure can cause important features to be selected in many trees of the ensemble, decreasing the diversity of the collection as a whole. In this paper, we introduce Proactive Forest, an improvement of Random Forest that uses information from the trees already generated to induce the remaining ones. Proactive Forest computes the importance of each feature for the ensemble constructed so far and uses it to modify the probabilities of selecting those features in the remaining trees. In the conducted experiments, Proactive Forest increased the diversity of the obtained ensembles, with a significant impact on classifier accuracy.
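
The idea described in the abstract can be illustrated with a minimal sketch (not the authors' implementation, which is not reproduced here): trees are grown one at a time, the feature importances accumulated by the ensemble built so far are turned into a penalty, and the probabilities used to sample candidate features for the next tree are lowered for features that are already heavily used. In this sketch the class name ProactiveForestSketch, the exponential penalty controlled by alpha, and the simplification of drawing one weighted feature subset per tree (rather than adjusting probabilities at every decision node, as the paper describes) are assumptions made for brevity; scikit-learn's DecisionTreeClassifier stands in for the base learner, and class labels are assumed to be non-negative integers.

import numpy as np
from sklearn.tree import DecisionTreeClassifier


class ProactiveForestSketch:
    """Illustrative sketch: sequential tree induction with feature-selection
    probabilities proactively down-weighted for features that are already
    important in the ensemble built so far."""

    def __init__(self, n_trees=100, alpha=0.5, random_state=0):
        self.n_trees = n_trees      # ensemble size
        self.alpha = alpha          # strength of the importance penalty (assumed knob)
        self.rng = np.random.default_rng(random_state)
        self.trees = []             # list of (fitted tree, feature subset) pairs

    def fit(self, X, y):
        n_samples, n_features = X.shape
        k = max(1, int(np.sqrt(n_features)))            # Random-Forest-style subset size
        probs = np.full(n_features, 1.0 / n_features)   # start uniform, as in Random Forest
        for _ in range(self.n_trees):
            # Weighted sampling without replacement: features that already carry
            # high importance in the ensemble are less likely to be offered here.
            subset = self.rng.choice(n_features, size=k, replace=False, p=probs)
            boot = self.rng.integers(0, n_samples, size=n_samples)  # bootstrap sample
            tree = DecisionTreeClassifier(
                random_state=int(self.rng.integers(2**31 - 1)))
            tree.fit(X[boot][:, subset], y[boot])
            self.trees.append((tree, subset))
            # Penalise each feature in proportion to the importance it just gained,
            # then renormalise so the probabilities remain a valid distribution.
            penalty = np.zeros(n_features)
            penalty[subset] = tree.feature_importances_
            probs = probs * np.exp(-self.alpha * penalty)
            probs = probs / probs.sum()
        return self

    def predict(self, X):
        # Plain majority vote over the trees, each seeing only its own subset.
        votes = np.stack([t.predict(X[:, s]) for t, s in self.trees])
        return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])

With alpha set to zero the sampling stays uniform and the sketch behaves like a per-tree variant of Random Forest; as alpha grows, features that dominated earlier trees are offered to later trees less often, which is the diversity-promoting effect the abstract describes.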



Author information

Corresponding author

Correspondence to Milton García-Borroto.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Cepero-Pérez, N., Denis-Miranda, L.A., Hernández-Palacio, R., Moreno-Espino, M., García-Borroto, M. (2018). Proactive Forest for Supervised Classification. In: Hernández Heredia, Y., Milián Núñez, V., Ruiz Shulcloper, J. (eds) Progress in Artificial Intelligence and Pattern Recognition. IWAIPR 2018. Lecture Notes in Computer Science, vol 11047. Springer, Cham. https://doi.org/10.1007/978-3-030-01132-1_29

  • DOI: https://doi.org/10.1007/978-3-030-01132-1_29

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01131-4

  • Online ISBN: 978-3-030-01132-1

  • eBook Packages: Computer Science, Computer Science (R0)
