Abstract
Generally speaking, dependency-analysis-based Bayesian network learning algorithms are relatively efficient. J. Cheng's algorithm is representative of this class of algorithms, but its efficiency can be improved further. This paper presents an efficient Bayesian network learning algorithm that improves on J. Cheng's algorithm, which uses Mutual Information (MI) and Conditional Mutual Information (CMI) as Conditional Independence (CI) tests. By redefining the equations for calculating MI and CMI, our algorithm eliminates a large number of basic operations such as logarithms and divisions, and reduces the number of passes over the dataset to a minimum. Moreover, to calculate CMI efficiently, an efficient method for finding an approximate minimum cut-set is proposed in our algorithm. Experimental results show that, at the same accuracy, our algorithm is much more efficient than J. Cheng's algorithm.
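For context, the following is a minimal sketch of the standard definitions of MI and CMI as used for CI testing in dependency-analysis algorithms such as Cheng et al. (1997, 2002); the rearranged, operation-saving forms announced in the abstract are specific to the paper and are not reproduced here. The threshold \varepsilon and cut-set \mathbf{C} below are the usual ingredients of such a test, not notation taken from the paper itself.

\[
I(X;Y) = \sum_{x,y} P(x,y)\,\log\frac{P(x,y)}{P(x)\,P(y)},
\qquad
I(X;Y \mid \mathbf{C}) = \sum_{x,y,\mathbf{c}} P(x,y,\mathbf{c})\,\log\frac{P(x,y \mid \mathbf{c})}{P(x \mid \mathbf{c})\,P(y \mid \mathbf{c})}.
\]

In this style of algorithm, X and Y are declared (conditionally) independent whenever the corresponding quantity falls below a small threshold \varepsilon, so the cost of structure learning is dominated by how cheaply these sums can be evaluated from the dataset.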
References
Acid, S., Campos, L.M.: An algorithm for finding minimum d-separating sets in belief networks. In: Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence (1996)
Buntine, W.: A guide to the literature on learning probabilistic networks from data. IEEE Transactions on Knowledge and Data Engineering 8, 195–210 (1996)
Cheng, J., Bell, D.A., Liu, W.: Learning belief networks from data: An information theory based approach. In: Proceedings of the Sixth ACM International Conference on Information and Knowledge Management (1997)
Cheng, J., Greiner, R., Kelly, J., Bell, D.A., Liu, W.: Learning Bayesian Networks from Data: an Information-Theory Based Approach. The Artificial Intelligence Journal 137, 43–90 (2002)
Dash, D., Druzdzel, M.J.: Robust Independence Testing for Constraint-Based Learning of Causal Structure. In: UAI 2003, pp. 167–174 (2003)
Friedman, N.: The Bayesian structural EM algorithm. In: Fourteenth Conf. on Uncertainty in Artificial Intelligence (1998)
Heckerman, D., Geiger, D., Chickering, D.: Learning Bayesian networks: The combination of knowledge and statistical data. Machine Learning 20, 197–243 (1995)
Lam, W., Bacchus, F.: Learning Bayesian belief networks: An approach based on the MDL principle. Computational Intelligence 10, 269–293 (1994)
Madsen, A.L., Jensen, F.V.: Lazy propagation: a junction tree inference algorithm based on lazy evaluation. Artificial Intelligence 113, 203–245 (1999)
Pearl, J.: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, Inc., San Mateo (1988)
Tian, F., Lu, Y., Shi, C.: Learning Bayesian networks with hidden variables using the combination of EM and evolutionary algorithm. In: Proceedings of the 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, pp. 568–574. Springer, Heidelberg (2001)
Tian, F., Lu, Y., Shi, C.: Learning Bayesian networks from incomplete data based on EMI method. In: Proceedings of ICDM 2003, Melbourne, Florida, USA, pp. 323–330 (2003)
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Tian, F., Tian, S., Yu, J., Huang, H. (2005). An Improved Bayesian Network Learning Algorithm Based on Dependency Analysis. In: Hao, Y., et al. (eds.) Computational Intelligence and Security. CIS 2005. Lecture Notes in Computer Science, vol. 3801. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11596448_5
DOI: https://doi.org/10.1007/11596448_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-30818-8
Online ISBN: 978-3-540-31599-5