Abstract
One of the products sold by insurance companies is car insurance, and cold calling is a common technique for offering it. Cold calling often demoralizes sellers because they face many rejections when offering insurance products. This problem can be reduced by first classifying prospective buyers' data into customers who have the potential to buy insurance and customers who do not. The obtained data contain many features that could support the classification process; however, not all features contribute to improving classification accuracy. Feature selection, a machine learning technique, reduces dimensionality and can improve classification accuracy. In this paper, we examine the One-Dimensional Naïve Bayes Classifier (1-DBC) as a feature selection method applied to two classifiers, the Support Vector Machine and Logistic Regression. Our simulations show that, with the selected features, both classifiers can use fewer features while producing comparable accuracy in classifying prospective car insurance buyers.
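The feature-selection idea described above can be sketched as follows: fit a one-dimensional (single-feature) Gaussian naïve Bayes classifier to each feature separately, score each feature by how well it alone separates the classes, and keep only the top-scoring features for the downstream classifier (e.g., SVM or logistic regression). This is a minimal illustration on synthetic data, not the authors' implementation; all function names and the data here are hypothetical.

```python
import numpy as np

def one_d_nb_score(x, y):
    """Relevance score of one feature: training accuracy of a
    single-feature Gaussian naive Bayes classifier (two classes)."""
    scores = np.zeros((len(x), 2))
    for c in (0, 1):
        xc = x[y == c]
        mu, var = xc.mean(), xc.var() + 1e-9  # small constant avoids /0
        log_prior = np.log(len(xc) / len(x))
        log_lik = -0.5 * np.log(2 * np.pi * var) - (x - mu) ** 2 / (2 * var)
        scores[:, c] = log_prior + log_lik
    return (scores.argmax(axis=1) == y).mean()

def select_features(X, y, k):
    """Rank features by their 1-D NB score and keep the top k indices."""
    scores = np.array([one_d_nb_score(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Synthetic demo: features 0 and 1 carry class signal, 2-4 are noise.
rng = np.random.default_rng(0)
n = 500
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 5))
X[:, 0] += 2.0 * y   # informative feature
X[:, 1] -= 1.5 * y   # informative feature
top = select_features(X, y, k=2)
print(sorted(top.tolist()))
```

The reduced matrix `X[:, top]` would then be passed to an SVM or logistic regression, which is the setup the paper evaluates.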
Acknowledgment
This work was supported by Universitas Indonesia under PITTA 2019 grant. Any opinions, findings, and conclusions or recommendations are the authors’ and do not necessarily reflect those of the sponsor.
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
Cite this paper
Salma, D.F., Murfi, H., Sarwinda, D. (2019). The Performance of One Dimensional Naïve Bayes Classifier for Feature Selection in Predicting Prospective Car Insurance Buyers. In: Tan, Y., Shi, Y. (eds) Data Mining and Big Data. DMBD 2019. Communications in Computer and Information Science, vol 1071. Springer, Singapore. https://doi.org/10.1007/978-981-32-9563-6_13
DOI: https://doi.org/10.1007/978-981-32-9563-6_13
Publisher Name: Springer, Singapore
Print ISBN: 978-981-32-9562-9
Online ISBN: 978-981-32-9563-6
eBook Packages: Computer Science (R0)