Asymptotic Comparison of Estimators in the Ising Model
Because of their use as priors in image analysis, interest in parameter estimation for Gibbs random fields has risen recently. Gibbs fields form an exponential family, so maximum likelihood would be the estimator of first choice. Unfortunately, it is extremely difficult to compute. Other estimators which are easier to compute have been proposed: the coding and the pseudo-maximum likelihood estimators (Besag, 1974), a minimum chi-square estimator (Glötzl and Rauchenschwandtner, 1981; Possolo, 1986-a) and the conditional least squares estimator (Lele and Ord, 1986); cf. the definitions in Section 2.2 below. These estimators are all known to be consistent. It is therefore natural to compare the efficiency of these simple estimators among themselves and with respect to the maximum likelihood estimator. We do this here in the simplest non-trivial case, the d-dimensional nearest-neighbour isotropic Ising model with external field. We show that both the pseudo-maximum likelihood and the conditional least squares estimator are asymptotically equivalent to a minimum chi-square estimator when the weight matrix for the latter is chosen appropriately (Corollary 2). These weight matrices differ from the optimal matrix, so we expect the resulting estimators to differ as well, although in all our examples the maximum pseudo-likelihood and the minimum chi-square estimator with optimal weight turned out to be asymptotically equivalent. In particular, our results do not confirm the superior behavior of minimum chi-square over pseudo-maximum likelihood reported in Possolo (1986-a). By example, we show that conditional least squares and minimum chi-square with the identity matrix as weights can be worse than the optimal minimum chi-square estimator. Compared with maximum likelihood, the easily computable estimators are not bad if the interaction is weak, but much worse if the interaction is strong.
Our results suggest that their asymptotic efficiency tends to zero as one approaches the critical point.
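To make the pseudo-maximum likelihood estimator concrete, here is a minimal numerical sketch, not taken from the paper: for a ±1 lattice, each site's conditional distribution given its four nearest neighbours is logistic in the interaction β and external field h, and the estimator maximises the sum of the conditional log-likelihoods over all sites. The lattice size, number of Gibbs sweeps, true parameter values, and the checkerboard sampler below are all illustrative assumptions.

```python
# Sketch of pseudo-maximum likelihood (Besag, 1974) for the 2-D
# nearest-neighbour isotropic Ising model with external field.
# All numerical settings here are illustrative, not from the paper.
import numpy as np
from scipy.optimize import minimize

def neighbour_sum(x):
    """Sum of the four nearest neighbours, periodic boundary conditions."""
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0)
            + np.roll(x, 1, 1) + np.roll(x, -1, 1))

def neg_pseudo_loglik(theta, x):
    """Negative log pseudo-likelihood -sum_s log P(x_s | neighbours).

    P(x_s | rest) = 1 / (1 + exp(-2 x_s (beta * S_s + h))),
    where S_s is the neighbour sum at site s.
    """
    beta, h = theta
    s = neighbour_sum(x)
    return np.sum(np.log1p(np.exp(-2.0 * x * (beta * s + h))))

def pmle(x):
    """Maximise the pseudo-likelihood over (beta, h)."""
    res = minimize(neg_pseudo_loglik, x0=np.zeros(2), args=(x,),
                   method="Nelder-Mead")
    return res.x

# Illustrative data: a 32x32 lattice sampled by checkerboard Gibbs
# sweeps at beta = 0.2 (subcritical), h = 0.
rng = np.random.default_rng(0)
n, beta0, h0 = 32, 0.2, 0.0
x = rng.choice([-1.0, 1.0], size=(n, n))
mask = (np.indices((n, n)).sum(axis=0) % 2) == 0
for _ in range(200):
    for m in (mask, ~mask):          # update black sites, then white
        p = 1.0 / (1.0 + np.exp(-2.0 * (beta0 * neighbour_sum(x) + h0)))
        x[m] = np.where(rng.random((n, n)) < p, 1.0, -1.0)[m]

beta_hat, h_hat = pmle(x)
```

Each factor in the pseudo-likelihood involves only a site and its neighbours, so no normalising constant of the full Gibbs measure is ever computed; this is exactly what makes the estimator tractable where maximum likelihood is not.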
- Besag J., 1976, Parameter estimation for Markov fields, Tech. report series n° 2, Dept. of Stat., Princeton University.
- Besag J., 1977, Efficiency of pseudo-likelihood estimation for simple Gaussian fields, Biometrika, p. 616–618.
- Comets F., 1989, On consistency of a class of estimators for exponential families of Markov random fields on the lattice, preprint Orsay n° 89–30.
- Comets F. and Gidas B., 1991, Asymptotics of maximum likelihood estimators for the Curie-Weiss model, to appear in Ann. of Stat.
- Georgii H.O., 1988, Gibbs Measures and Phase Transitions, Studies in Maths. 9, de Gruyter, Berlin.
- Glötzl E., Rauchenschwandtner B., 1981, On the statistics of Gibbsian processes, in The First Pannonian Symp. on Math. Stat., Eds Revesz P., Schmetterer L., Zolotarev V.M., Lect. Notes in Stat. 8, Springer-Verlag.
- Guyon X., 1987, Estimation d’un champ par pseudo-vraisemblance et application au cas markovien, Spatial Proc. and Sp. Time Series, Proc. 6th Franco-Belgian Meeting 1985, Ed. Droesbeke, Pub. Fac. Univ. Saint Louis, Bruxelles, p. 15–62.
- Guyon X. and Hardouin C., 1990, The chi-square coding test for nested Markov random field hypotheses, this volume.
- Kindermann R. and Snell J.L., 1980, Markov Random Fields and their Applications, Contemporary Maths. n° 1, AMS, Providence.
- Lele S.R. and Ord J.K., 1986, Conditional least squares estimation for spatial processes: some asymptotic results, Tech. report n° 65, Dept. Stat., The Pennsylvania State Univ.
- Martin-Löf A., 1973, Mixing properties, differentiability of the free energy and central limit theorem for a pure phase in the Ising model at low temperature, Comm. Math. Phys. 32, p. 75–92.
- Possolo A., 1986-a, Estimation of binary Markov random fields, Tech. report n° 77, Dept. Stat., Univ. of Washington.
- Possolo A., 1986-b, Sub-sampling a random field, Tech. report n° 78, Dept. Stat., Univ. of Washington.