Managing Monotonicity in Classification by a Pruned Random Forest
In ordinal monotonic classification problems, the class variable should increase with respect to a subset of the explanatory variables. Standard classifiers do not guarantee to produce models that satisfy these monotonicity constraints. Some algorithms have been developed to manage this issue, such as decision trees with modified growing and pruning mechanisms. In this contribution we study the suitability of using these mechanisms in the generation of Random Forests. We introduce a simple ensemble pruning mechanism based on the degree of monotonicity. After an exhaustive experimental analysis, we conclude that a Random Forest applied to these problems is able to achieve slightly better predictive performance than standard algorithms.
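The "degree of monotonicity" used for pruning can be illustrated with a non-monotonicity index: the fraction of comparable instance pairs (one dominating the other in every explanatory variable) whose predicted labels clash. The sketch below is an assumption-laden illustration of this idea, not the authors' exact mechanism; the function name and the pairwise definition are choices made here for clarity.

```python
import numpy as np

def non_monotonicity_index(X, y):
    """Illustrative non-monotonicity index (not the paper's exact metric).

    Counts ordered pairs (i, j) with X[i] <= X[j] componentwise
    (comparable pairs) and returns the fraction of them whose
    labels clash, i.e. y[i] > y[j]. A value of 0.0 means the
    labeling is fully monotone on this sample.
    """
    n = len(y)
    comparable = 0
    clashes = 0
    for i in range(n):
        for j in range(n):
            if i != j and np.all(X[i] <= X[j]):
                comparable += 1
                if y[i] > y[j]:
                    clashes += 1
    # No comparable pairs: treat the sample as trivially monotone.
    return clashes / comparable if comparable else 0.0
```

In an ensemble-pruning setting, one could score each tree of a trained Random Forest by applying this index to its predictions on a validation set and discard the trees with the highest values, keeping the most monotone sub-ensemble.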
Keywords: Monotonic classification · Decision tree induction · Random forest · Ensemble pruning
This work is supported by the research project TIN2014-57251-P and by a research scholarship awarded to the author Sergio Gonzalez by the University of Granada.