Parametric Models and Bayesian Methods

Part of the Springer Texts in Statistics book series (STS)

Parametric statistical models relate the observed data to the postulated stochastic mechanisms that generate them and that are completely specified except for certain parameters. These parameters are assumed to be unknown and are inferred from the data. A powerful and widely used approach to parametric inference is based on the likelihood function introduced in Section 2.4. Section 4.1 considers computational issues and applies maximum likelihood to regression problems. As pointed out in Section 2.4, the least squares method is equivalent to maximum likelihood when the random errors ε_t are assumed to be i.i.d. normal with mean 0. There are many applications in which this assumption is clearly violated. In particular, when the response variable y_t in a regression model is a binary variable taking only the values 0 and 1, a natural extension of the linear regression function θ^T x_t is to specify how the parameter of the Bernoulli distribution of y_t depends on θ^T x_t. Logistic regression provides such an extension and is described in Section 4.1.2, which also extends the basic ideas of logistic regression to more general settings, called generalized linear models.
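The connection sketched above between the Bernoulli likelihood and logistic regression can be illustrated with a short maximum likelihood fit. The following is a minimal sketch, not code from the book: the gradient-ascent optimizer, the learning rate, and the simulated data set are illustrative assumptions, and Section 4.1's actual computational methods may differ.

```python
import numpy as np

def logistic_log_likelihood(theta, X, y):
    """Bernoulli log-likelihood with p_t = 1 / (1 + exp(-theta^T x_t))."""
    z = X @ theta
    # log L(theta) = sum_t [ y_t z_t - log(1 + exp(z_t)) ], stabilized via logaddexp
    return np.sum(y * z - np.logaddexp(0.0, z))

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Maximize the log-likelihood by plain gradient ascent (illustrative choice)."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ theta)))   # fitted Bernoulli parameters
        grad = X.T @ (y - p)                     # score function of the log-likelihood
        theta += lr * grad / len(y)
    return theta

# Simulated data with a hypothetical true parameter theta = (-1, 2):
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
true_theta = np.array([-1.0, 2.0])
p = 1.0 / (1.0 + np.exp(-(X @ true_theta)))
y = (rng.uniform(size=500) < p).astype(float)

theta_hat = fit_logistic(X, y)
```

With 500 simulated observations, `theta_hat` lands near the true parameter vector, and the fitted log-likelihood exceeds that of the all-zero starting point, as maximization requires.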




Copyright information

© Springer Science+Business Media, LLC 2008
