
An Improved Attribute Value-Weighted Double-Layer Hidden Naive Bayes Classification Algorithm

  • Huanying Zhang
  • Yushui Geng
  • Fei Wang
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1143)

Abstract

The Hidden Naive Bayes (HNB) classification algorithm is a structurally extended Naive Bayes classification algorithm that introduces a hidden parent node for each attribute so that the dependencies between attributes can be exploited. However, during classification it ignores the influence of attribute pairs on each attribute. The double-layer Hidden Naive Bayes (DHNB) classification algorithm therefore fully considers the dependencies between attribute pairs and the individual attributes. However, DHNB still does not consider the contribution that the different values of each feature attribute make to the classification. To address this problem, an improved DHNB algorithm is obtained by constructing a weighting function that measures the contribution of each feature attribute value to the classification and using this function to weight the corresponding terms in the DHNB formula. Finally, the improved algorithm was evaluated in simulation experiments on datasets from the University of California, Irvine (UCI) machine learning repository. The results show that the improved algorithm achieves higher classification efficiency than the original DHNB algorithm and that the method has good applicability.
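The sketch below illustrates the attribute-value-weighting idea described in the abstract on a plain Naive Bayes base rather than the full double-layer HNB structure; the concrete weighting function used here (the KL divergence of the value-conditional class distribution from the class prior) is an illustrative assumption, not necessarily the weighting function defined in the paper.

```python
# Minimal sketch: attribute-value-weighted Naive Bayes on discretised data.
# Assumptions (not from the paper): attributes are non-negative integers,
# Laplace smoothing with alpha, and the weight of a value v of attribute i is
# KL(P(c | A_i = v) || P(c)), so values that shift the class distribution
# further from the prior contribute more strongly to the decision.
import numpy as np

class ValueWeightedNB:
    def __init__(self, alpha=1.0):
        self.alpha = alpha  # Laplace smoothing parameter

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=int), np.asarray(y, dtype=int)
        n, d = X.shape
        self.classes_ = np.unique(y)
        self.n_values_ = [X[:, i].max() + 1 for i in range(d)]
        # Smoothed class priors P(c)
        self.prior_ = np.array(
            [(np.sum(y == c) + self.alpha) / (n + self.alpha * len(self.classes_))
             for c in self.classes_])
        self.cond_, self.weight_ = [], []
        for i in range(d):
            V = self.n_values_[i]
            # Smoothed conditionals P(A_i = v | c)
            cond = np.zeros((len(self.classes_), V))
            for ci, c in enumerate(self.classes_):
                Xi = X[y == c, i]
                counts = np.bincount(Xi, minlength=V)
                cond[ci] = (counts + self.alpha) / (len(Xi) + self.alpha * V)
            # P(c | A_i = v) by Bayes' rule over the training distribution
            post = cond * self.prior_[:, None]
            post /= post.sum(axis=0, keepdims=True)
            # Illustrative value weight: divergence of P(c | v) from the prior
            w = np.sum(post * np.log(post / self.prior_[:, None]), axis=0)
            self.cond_.append(cond)
            self.weight_.append(w / (w.mean() + 1e-12))  # normalise around 1
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=int)  # assumes test values were seen in training
        preds = []
        for x in X:
            log_score = np.log(self.prior_).copy()
            for i, v in enumerate(x):
                # Weight each likelihood term in log-space by its value weight,
                # mirroring how the improved DHNB weights its formula terms.
                log_score += self.weight_[i][v] * np.log(self.cond_[i][:, v])
            preds.append(self.classes_[np.argmax(log_score)])
        return np.array(preds)
```

In use, a discretised UCI dataset (integer-coded attributes and class labels) would be passed to `fit`, and `predict` then scores each instance with the value-weighted likelihoods; the paper's actual method additionally layers in the dependencies between attribute pairs via the hidden parents of DHNB.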

Keywords

Hidden Naive Bayes · Double-layer Hidden Naive Bayes · Weighting function · Classification efficiency


Copyright information

© Springer Nature Singapore Pte Ltd. 2021

Authors and Affiliations

  1. School of Information, Qilu University of Technology (Shandong Academy of Sciences), Jinan, China
  2. Graduate School, Qilu University of Technology (Shandong Academy of Sciences), Jinan, China
