Introduction to Bayesian Inference

Part of the International Series in Operations Research & Management Science book series (ISOR, volume 105)

Reverend Thomas Bayes, a Presbyterian minister who lived in England in the 18th century, wrote a manuscript on "inverse probability" related to the binomial distribution; it was published posthumously in 1763. Bayes' goal was to make probability inferences about the parameter of a binomial distribution. In 1774, working independently, Laplace stated what is now known as Bayes' theorem in general form.

Bayesian inference combines prior beliefs about model parameters with evidence from data using Bayes' theorem. This approach adopts a subjective interpretation of probability, in contrast to the "frequentist" approach, in which the probability of an event is the limiting value of its relative frequency over repeated trials. The main criticisms of Bayesian analysis have been 1) that it is not objective (a point that has been debated for many years), and 2) that the required computations are difficult. The second criticism has been overcome to a large extent in the last 10-15 years thanks to advances in integration methods, particularly Markov Chain Monte Carlo (MCMC) methods. The object of this chapter is to present an introduction to statistical inference problems from a Bayesian point of view. This will lead us in the next chapter to Bayesian regression and its use in process optimization.
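To make the prior-plus-data idea concrete, the following minimal sketch works through Bayes' original problem: inferring the success probability of a binomial distribution. It uses the standard Beta-Binomial conjugate update; the prior parameters and data below are illustrative assumptions, not figures from the chapter.

```python
# Conjugate Bayesian update for a binomial success probability p.
# With a Beta(a, b) prior on p and s successes observed in n trials,
# Bayes' theorem gives the posterior in closed form: Beta(a + s, b + n - s).
a, b = 1.0, 1.0          # Beta(1, 1) = uniform prior (an illustrative choice)
s, n = 7, 10             # hypothetical data: 7 successes in 10 trials

a_post, b_post = a + s, b + n - s     # posterior parameters
posterior_mean = a_post / (a_post + b_post)

print(a_post, b_post, round(posterior_mean, 4))  # 8.0 4.0 0.6667
```

Note how the posterior mean (8/12 ≈ 0.667) sits between the prior mean (0.5) and the sample proportion (0.7), weighted by the relative amount of prior and data information. For models without such conjugate structure, the posterior has no closed form, which is where MCMC methods become necessary.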


Keywords: Posterior Distribution · Markov Chain Monte Carlo · Bayesian Inference · Invariance Principle · High Posterior Density



Copyright information

© Springer Science+Business Media, LLC 2007
