Abstract
This paper analyzes the problem of learning the structure of a Bayes net (BN) in the theoretical framework of Gold’s learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and for probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes net structure, in which learning is based on observed conditional dependencies between variables of interest (e.g., “X is dependent on Y given any assignment to variable Z”). Applying learning criteria in this model yields the following results. (1) The mind-change complexity of identifying a Bayes net graph over variables V from dependency data is \(\binom{|\mathbf{V}|}{2}\), the maximum number of edges. (2) There is a unique fastest mind-change-optimal Bayes net learner, where convergence speed is evaluated using Gold’s dominance notion of “uniformly faster convergence”. This learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed dependencies with a minimum number of edges, and outputs “no guess” otherwise. Thus standard learning criteria define a natural and novel Bayes net learning algorithm. We investigate the complexity of computing the output of the fastest mind-change-optimal learner and show that this problem is NP-hard (assuming P ≠ NP). To our knowledge this is the first NP-hardness result concerning the existence of a uniquely optimal Bayes net structure.
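The decision rule of the abstract’s learner — conjecture a graph only when it is the *unique* minimum-edge graph consistent with the observed dependencies, otherwise output “no guess” — can be illustrated with a toy sketch. The paper’s actual learner works over Bayes net patterns with d-separation; the sketch below is a deliberate simplification, not the authors’ algorithm: dependencies are unconditional pairs, graphs are undirected, and “satisfies a dependency” is approximated by plain graph connectivity. All names (`unique_minimal_learner`, `connected`) are hypothetical.

```python
from itertools import combinations

def connected(edges, x, y):
    """BFS reachability: is y reachable from x over undirected edges?"""
    frontier, seen = {x}, {x}
    while frontier:
        nxt = set()
        for u in frontier:
            for e in edges:
                if u in e:
                    (v,) = e - {u}          # the other endpoint of edge e
                    if v not in seen:
                        seen.add(v)
                        nxt.add(v)
        frontier = nxt
    return y in seen

def unique_minimal_learner(variables, dependencies):
    """Toy version of the 'unique minimum-edge or no guess' rule.

    Returns the unique edge set with the fewest edges that satisfies every
    dependency (here: connects every dependent pair), or 'no guess' if the
    minimum is not achieved by exactly one graph.
    """
    all_edges = [frozenset(p) for p in combinations(sorted(variables), 2)]
    # Search edge sets in order of increasing size, so the first satisfying
    # size k is the minimum number of edges.
    for k in range(len(all_edges) + 1):
        winners = []
        for subset in combinations(all_edges, k):
            if all(connected(list(subset), x, y) for (x, y) in dependencies):
                winners.append(set(subset))
                if len(winners) > 1:
                    break                   # minimum is not unique
        if winners:
            return winners[0] if len(winners) == 1 else "no guess"
    return "no guess"
```

For example, over V = {a, b} with the single dependency (a, b), the unique one-edge graph {a–b} is conjectured; over V = {a, b, c} with dependencies (a, b) and (b, c), three different two-edge graphs connect both pairs, so the learner outputs “no guess” — matching the cautious behavior the abstract describes.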
© 2007 Springer Berlin Heidelberg
Cite this paper
Schulte, O., Luo, W., Greiner, R. (2007). Mind Change Optimal Learning of Bayes Net Structure. In: Bshouty, N.H., Gentile, C. (eds.) Learning Theory. COLT 2007. Lecture Notes in Computer Science, vol. 4539. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72927-3_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-72925-9
Online ISBN: 978-3-540-72927-3