Posterior Analysis of Stochastic Frontier Models with Truncated Normal Errors
Previous work on stochastic frontier models with exponentially distributed one-sided errors, comparing Gibbs sampling with Monte Carlo integration via importance sampling, revealed the enormous computational gains achievable with the former. This paper takes up inference in another interesting class of stochastic frontier models, those with truncated normal one-sided error terms, and shows that posterior simulation involves drawing only from standard or log-concave distributions, implying that Gibbs sampling is an efficient solution to the Bayesian integration problem. The sampling behavior of the Bayesian procedure is investigated in a Monte Carlo experiment. The method is illustrated using US airline data.
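The core computational idea in the abstract, that the full conditionals of a truncated normal frontier model are standard distributions, can be sketched in a toy Gibbs sampler. The following is a minimal illustration, not the paper's full algorithm: the variance parameters and the truncation-point mean are held fixed at known values, and a flat prior is placed on the regression coefficients, so only the conditionals for the coefficients (multivariate normal) and the one-sided inefficiencies (truncated normal) are sampled. All variable names and the simulated data are hypothetical.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Simulate a toy frontier: y = b0 + b1*x + v - u,
# with noise v ~ N(0, sv^2) and inefficiency u ~ N(mu, su^2) truncated to u >= 0.
n = 200
b_true = np.array([1.0, 0.5])
sv, su, mu = 0.2, 0.5, 0.4          # treated as known in this sketch
x = rng.uniform(0.0, 2.0, n)
X = np.column_stack([np.ones(n), x])
u_true = truncnorm.rvs((0 - mu) / su, np.inf, loc=mu, scale=su,
                       size=n, random_state=rng)
y = X @ b_true + rng.normal(0.0, sv, n) - u_true

# Gibbs sampler: alternate between beta | u and u | beta.
XtX_inv = np.linalg.inv(X.T @ X)
u_draw = np.full(n, mu)
beta_draws = []
for it in range(1500):
    # beta | u: with a flat prior, regress (y + u) on X; the conditional
    # posterior is N(bhat, sv^2 (X'X)^{-1}).
    bhat = XtX_inv @ (X.T @ (y + u_draw))
    beta = rng.multivariate_normal(bhat, sv**2 * XtX_inv)

    # u_i | beta: the likelihood kernel N(x_i'beta - y_i, sv^2) times the
    # prior N(mu, su^2), truncated to [0, inf), is again a truncated normal.
    prec = 1.0 / sv**2 + 1.0 / su**2
    m = ((X @ beta - y) / sv**2 + mu / su**2) / prec
    s = np.sqrt(1.0 / prec)
    u_draw = truncnorm.rvs((0 - m) / s, np.inf, loc=m, scale=s,
                           random_state=rng)

    if it >= 500:                    # discard burn-in draws
        beta_draws.append(beta)

beta_post = np.mean(beta_draws, axis=0)
print(beta_post)                     # posterior means of (b0, b1)
```

Because every conditional here is either normal or truncated normal, each sweep is a set of direct draws with no rejection step, which is the source of the efficiency the abstract attributes to Gibbs sampling in this model class.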
Keywords: Stochastic frontier model, Efficiency, Truncated normal distribution, Bayesian analysis, Gibbs sampling