Large-scale Ensemble Model for Customer Churn Prediction in Search Ads
Customer churn prediction is one of the most important problems in search-ads business management, a multi-billion-dollar market. The aim of churn prediction is to detect customers with a high propensity to leave the ads platform, so that they can be analyzed and retention efforts can be increased ahead of time. An ensemble model combines multiple weak models to obtain better predictive performance; the idea is inspired by the human cognitive system and is widely used across machine-learning applications. In this paper, we investigate how an ensemble model of gradient boosting decision trees (GBDT) can predict whether a customer will churn in the foreseeable future based on its activities in search ads. We extract two types of features for the GBDT: dynamic features and static features. For dynamic features, we consider a sequence of customer activities (e.g., impressions, clicks) over a long period. For static features, we consider customer settings (e.g., creation time, customer type). We evaluated prediction performance on a large-scale customer dataset from the Bing Ads platform; the results show that static and dynamic features are complementary, and combining all features achieves an AUC (area under the ROC curve) of 0.8410 on the test set. The proposed model is useful for predicting which customers will churn in the near future on the ads platform, and it has been successfully deployed to run daily on the Bing Ads platform.
Keywords: Churn prediction · Ensemble model · Machine learning · Search ads · Static features · Dynamic features
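The pipeline outlined in the abstract (engineer static and dynamic features, fit a GBDT, evaluate by AUC) can be sketched in miniature. Everything below, including the toy churn rule, the two example features, the stump-based booster, and the hyperparameters, is illustrative and not taken from the paper:

```python
import math


def auc(labels, scores):
    """Rank-based AUC: probability a random positive outranks a random negative."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


class Stump:
    """Depth-1 regression tree fitted to residuals: the weak learner."""

    def fit(self, X, resid):
        best = None
        for j in range(len(X[0])):                      # each feature
            for t in sorted({x[j] for x in X}):         # each candidate threshold
                left = [resid[i] for i, x in enumerate(X) if x[j] <= t]
                right = [resid[i] for i, x in enumerate(X) if x[j] > t]
                if not left or not right:
                    continue
                lm, rm = sum(left) / len(left), sum(right) / len(right)
                err = sum((v - lm) ** 2 for v in left) + sum((v - rm) ** 2 for v in right)
                if best is None or err < best[0]:
                    best = (err, j, t, lm, rm)
        _, self.j, self.t, self.lm, self.rm = best
        return self

    def predict(self, x):
        return self.lm if x[self.j] <= self.t else self.rm


def gbdt_fit(X, y, n_trees=20, lr=0.3):
    """Boost stumps on the negative gradient of the logistic loss (y - p)."""
    F = [0.0] * len(X)
    trees = []
    for _ in range(n_trees):
        p = [1.0 / (1.0 + math.exp(-f)) for f in F]
        resid = [yi - pi for yi, pi in zip(y, p)]
        tree = Stump().fit(X, resid)
        trees.append(tree)
        F = [f + lr * tree.predict(x) for f, x in zip(F, X)]
    return trees


def gbdt_score(trees, x, lr=0.3):
    return sum(lr * t.predict(x) for t in trees)


# Illustrative data: one "dynamic" feature (clicks over a period) and one
# "static" feature (account age in months). The churn rule is invented:
# low-activity, young accounts churn.
X = [[c, a] for c in range(0, 21, 2) for a in range(2, 36, 4)]
y = [1 if c < 6 and a < 18 else 0 for c, a in X]

trees = gbdt_fit(X, y)
train_auc = auc(y, [gbdt_score(trees, x) for x in X])
```

In practice one would use a tuned GBDT library rather than hand-rolled stumps; the sketch only shows the shape of the residual-fitting loop and the rank-based AUC metric the paper reports.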
We would also like to thank all members of the Bing Ads Adinsight team and the PM team at Microsoft for their discussion of and help with this work.
This study was funded by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China under Grants 17KJB520041 and 17KJD520010; the Natural Science Foundation of Jiangsu Province under Grants BK20181189 and BK20181190; the Open Project Fund of the National Laboratory of Pattern Recognition under Grant 201800020; the Key Program Special Fund of XJTLU under Grants KSF-A-10, KSF-A-01, and KSF-P-02; and the XJTLU Research Development Fund under Grant RDF-16-02-49. In addition, A. Hussain was supported by the UK Engineering and Physical Sciences Research Council (EPSRC) under Grant AV-COGHEAR, reference EP/M026981/1.
Compliance with Ethical Standards
Conflict of interest
The authors declare that they have no conflict of interest.
This article does not contain any studies with human participants performed by any of the authors.
Informed consent was obtained from all individual participants included in the study.