This chapter presents an overview of basic concepts in probability theory that are important for understanding probabilistic graphical models. First, the main interpretations and the mathematical definition of probability are introduced. Second, the basic rules of probability theory are presented, including conditional independence and Bayes' rule. Third, random variables and some important distributions are described. Lastly, the basics of information theory are presented.
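The Bayes' rule mentioned above can be previewed with a minimal numeric sketch. The function name and the diagnostic-test numbers below are illustrative assumptions, not taken from the chapter:

```python
# Illustrative sketch of Bayes' rule, P(A|B) = P(B|A) P(A) / P(B),
# with the evidence P(B) expanded via the law of total probability.

def bayes_posterior(prior, likelihood, likelihood_complement):
    """Posterior P(A|B) via Bayes' rule.

    prior                 -- P(A)
    likelihood            -- P(B|A)
    likelihood_complement -- P(B|not A)
    """
    evidence = likelihood * prior + likelihood_complement * (1.0 - prior)
    return likelihood * prior / evidence

# Hypothetical example: 1% disease prevalence, a test with 90%
# sensitivity and a 5% false-positive rate.
posterior = bayes_posterior(prior=0.01, likelihood=0.9,
                            likelihood_complement=0.05)
print(f"P(disease | positive test) = {posterior:.4f}")  # -> 0.1538
```

Even with a fairly accurate test, the posterior stays low because the prior (prevalence) is small — the kind of reasoning the chapter's rules of probability make precise.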