Abstract
One of the most popular machine learning algorithms is gradient boosting over decision trees. Out of the box, it achieves high quality combined with comparatively low training and inference time. However, modern machine learning applications demand algorithms that achieve better quality at lower inference cost, which motivates exploring gradient boosting over other forms of base learners. One such advanced base learner is the piecewise linear tree, which predicts with a linear function in each leaf. This paper introduces an efficient histogram-based algorithm for building gradient boosting ensembles of such trees. The algorithm was compared with modern gradient boosting libraries on publicly available datasets and achieved better quality with a smaller ensemble size and lower inference time. It is also proven that the algorithm is invariant to linear transformations of individual features.
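To make the base learner concrete: the sketch below illustrates a regression tree whose leaves hold least-squares linear models, plus a gradient boosting loop over such trees for squared loss (where the negative gradient is the residual). This is only a toy illustration of the structure the abstract describes, assuming simple median splits; the paper's actual contribution is a histogram-based split-finding algorithm, which is not reproduced here, and all names (`Leaf`, `Split`, `build_tree`, `boost`) are illustrative.

```python
import numpy as np


class Leaf:
    """A leaf that predicts with a linear model fitted on its samples."""

    def __init__(self, X, y):
        # Least-squares fit with an intercept column appended.
        A = np.hstack([X, np.ones((len(X), 1))])
        self.coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(self, X):
        A = np.hstack([X, np.ones((len(X), 1))])
        return A @ self.coef


class Split:
    """An internal node routing samples by a single-feature threshold."""

    def __init__(self, feature, threshold, left, right):
        self.feature, self.threshold = feature, threshold
        self.left, self.right = left, right

    def predict(self, X):
        out = np.empty(len(X))
        mask = X[:, self.feature] <= self.threshold
        out[mask] = self.left.predict(X[mask])
        out[~mask] = self.right.predict(X[~mask])
        return out


def build_tree(X, y, depth=0, max_depth=2, min_leaf=8):
    # Toy criterion: split the (depth-cycled) feature at its median.
    # The paper instead selects splits via histogram-based gain computation.
    if depth == max_depth or len(X) < 2 * min_leaf:
        return Leaf(X, y)
    f = depth % X.shape[1]
    t = np.median(X[:, f])
    mask = X[:, f] <= t
    if mask.all() or not mask.any():
        return Leaf(X, y)
    return Split(f, t,
                 build_tree(X[mask], y[mask], depth + 1, max_depth, min_leaf),
                 build_tree(X[~mask], y[~mask], depth + 1, max_depth, min_leaf))


def boost(X, y, n_trees=10, lr=0.3):
    # Gradient boosting for squared loss: each tree fits the residuals
    # (the negative gradient), and predictions are shrunk by `lr`.
    pred = np.zeros(len(y))
    trees = []
    for _ in range(n_trees):
        trees.append(build_tree(X, y - pred))
        pred += lr * trees[-1].predict(X)
    return trees
```

Because each leaf is itself a linear model, a shallow tree can already represent a piecewise linear target exactly, which is why ensembles of such trees can match the quality of deeper constant-leaf trees with fewer, smaller members.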
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Guryanov, A. (2019). Histogram-Based Algorithm for Building Gradient Boosting Ensembles of Piecewise Linear Decision Trees. In: van der Aalst, W., et al. (eds.) Analysis of Images, Social Networks and Texts. AIST 2019. Lecture Notes in Computer Science, vol. 11832. Springer, Cham. https://doi.org/10.1007/978-3-030-37334-4_4
Print ISBN: 978-3-030-37333-7
Online ISBN: 978-3-030-37334-4