
Confidence in Prediction: An Approach for Dynamic Weighted Ensemble

  • Duc Thuan Do
  • Tien Thanh Nguyen (corresponding author)
  • The Trung Nguyen
  • Anh Vu Luong
  • Alan Wee-Chung Liew
  • John McCall
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12033)

Abstract

Combining classifiers in an ensemble can achieve better predictions than using any single classifier. Performance can be boosted further by associating each classifier with a weight in the aggregation. In this work, we propose a novel dynamic weighted ensemble method. Based on the observation that each classifier provides a different level of confidence in its prediction, we encode this confidence by associating with each classifier a credibility threshold, computed from the entire training set by minimizing the entropy loss function with mini-batch gradient descent. On each test sample, we measure the confidence of each classifier's output and compare it to the credibility threshold to decide whether that classifier should participate in the aggregation. If the condition is satisfied, the confidence level and the credibility threshold are used to compute the weight of the classifier's contribution to the aggregation. In this way, we account not only for the presence of each classifier but also for its degree of contribution, based on the confidence of its prediction on each test sample. Experiments on a number of datasets show that the proposed method outperforms several benchmark algorithms, including a non-weighted ensemble method, two dynamic ensemble selection methods, and two Boosting methods.
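
To make the mechanism concrete, below is a minimal sketch of the selection-and-weighting step at prediction time. The abstract does not specify the confidence measure, the weight formula, or the threshold-learning details, so everything here is an assumption: scikit-learn-style classifiers, the maximum posterior probability as the confidence score, fixed placeholder thresholds in place of the per-classifier thresholds the paper learns by minimizing the entropy loss with mini-batch gradient descent, and a weight given by how far a classifier's confidence exceeds its threshold.

```python
import numpy as np

def confidence(proba_row):
    # Assumed confidence measure: the maximum posterior probability
    # the classifier assigns to any class (the paper may use another score).
    return float(np.max(proba_row))

def dynamic_weighted_predict(classifiers, thresholds, x):
    # Only classifiers whose confidence on x reaches their credibility
    # threshold participate; each is weighted by confidence minus
    # threshold (an illustrative formula, not the paper's exact one).
    combined = None
    for clf, tau in zip(classifiers, thresholds):
        proba = clf.predict_proba(x.reshape(1, -1))[0]
        conf = confidence(proba)
        if conf >= tau:
            weighted = (conf - tau) * proba
            combined = weighted if combined is None else combined + weighted
    if combined is None:
        # Fallback when no classifier is confident enough: plain sum rule.
        combined = sum(clf.predict_proba(x.reshape(1, -1))[0] for clf in classifiers)
    return int(np.argmax(combined))

# Usage on a toy problem with a heterogeneous ensemble.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clfs = [LogisticRegression(max_iter=1000).fit(X, y),
        GaussianNB().fit(X, y),
        DecisionTreeClassifier(max_depth=3).fit(X, y)]
taus = [0.6, 0.6, 0.6]  # placeholder thresholds; the paper learns one per classifier
print(dynamic_weighted_predict(clfs, taus, X[0]))
```

The point the sketch illustrates is that both selection (the threshold test) and weighting (confidence relative to the threshold) are decided separately for each test sample, which is what makes the ensemble dynamic rather than statically weighted.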

Keywords

Supervised learning · Classification · Ensemble method · Ensemble learning · Multiple classifier system · Weighted ensemble


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. School of Applied Mathematics and Informatics, Hanoi University of Science and Technology, Hanoi, Vietnam
  2. School of Computing Science and Digital Media, Robert Gordon University, Aberdeen, UK
  3. School of Information and Communication Technology, Hanoi University of Science and Technology, Hanoi, Vietnam
  4. School of Information and Communication Technology, Griffith University, Gold Coast, Australia
