Parametric Models and Bayesian Methods
Parametric statistical models relate the observed data to the postulated stochastic mechanisms that generate them and that are completely specified except for certain parameters. These parameters are assumed to be unknown and are to be inferred from the data. A powerful and widely used approach to parametric inference is based on the likelihood function introduced in Section 2.4. In Section 4.1 we consider computational issues and apply maximum likelihood to regression problems. As pointed out in Section 2.4, the least squares method is equivalent to maximum likelihood when the random errors ε_t are assumed to be i.i.d. normal with mean 0. There are many applications in which this assumption is clearly violated. In particular, when the response variable y_t in a regression model is a binary variable that takes only the values 0 and 1, a natural extension of the linear regression function θ^T x_t is to relate the parameter of the Bernoulli distribution of y_t to θ^T x_t. Logistic regression provides such an extension and is described in Section 4.1.2, which also extends the basic ideas of logistic regression to more general settings, called generalized linear models.
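As a minimal sketch of the idea described above (not the book's own code), logistic regression links the Bernoulli success probability of a binary response y_t to the linear predictor θ^T x_t via the logistic function, and θ can be estimated by maximizing the Bernoulli log-likelihood. The function name, Newton-Raphson choice of optimizer, and synthetic data below are illustrative assumptions:

```python
import numpy as np

def fit_logistic(X, y, iters=50):
    """Estimate theta by maximizing the Bernoulli log-likelihood
    sum_t [y_t log p_t + (1 - y_t) log(1 - p_t)], where
    p_t = 1 / (1 + exp(-theta^T x_t)), using Newton-Raphson.
    (Illustrative sketch; not the text's implementation.)"""
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ theta))   # P(y_t = 1 | x_t)
        grad = X.T @ (y - p)                   # score vector d loglik / d theta
        W = p * (1.0 - p)
        H = -(X * W[:, None]).T @ X            # Hessian of the log-likelihood
        theta = theta - np.linalg.solve(H, grad)  # Newton update
    return theta

# Synthetic binary-response data (assumed for illustration only).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
true_theta = np.array([-0.5, 2.0])
p_true = 1.0 / (1.0 + np.exp(-X @ true_theta))
y = (rng.uniform(size=500) < p_true).astype(float)

theta_hat = fit_logistic(X, y)
```

Setting θ^T x_t equal to the log-odds log(p_t / (1 - p_t)), rather than to p_t itself, keeps the fitted probabilities in (0, 1); this is the logit link that Section 4.1.2 generalizes to other exponential-family responses.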
Keywords: Posterior Distribution · Prior Distribution · Bayesian Method · Nonlinear Regression Model · Wishart Distribution