Fisher’s Development of Conditional Inference

  • David Hinkley
Part of the Lecture Notes in Statistics book series (LNS, volume 1)

Abstract

In previous lectures we examined aspects of Fisher's theory of statistical estimation, with particular attention to sufficiency, efficiency, and information. In essence it is a likelihood-based theory, proposed as an alternative to the then-popular Bayesian theory of Laplace.
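
The abstract and keywords point to conditional inference and Monte Carlo estimation of conditional distributions. As a loose illustration, not taken from the chapter itself, the sketch below simulates two independent binomial samples forming a 2x2 table, conditions on the observed column total, and compares the Monte Carlo conditional frequency distribution of one cell with the exact hypergeometric distribution that Fisher-style conditioning gives under the null hypothesis of equal success probabilities. All sample sizes and parameter values are assumed purely for illustration.

```python
import numpy as np
from scipy.stats import hypergeom

# Hypothetical illustration (not from the chapter): conditional inference
# for a 2x2 table.  Rows are two independent binomial samples; under
# H0: p1 = p2, the distribution of the row-1 success count, conditional
# on the observed column total, is hypergeometric.  We check this by
# Monte Carlo simulation.

rng = np.random.default_rng(0)
n1, n2, p = 8, 12, 0.4            # assumed row totals and common success prob.
n_sim = 200_000

x1 = rng.binomial(n1, p, n_sim)   # successes in row 1
x2 = rng.binomial(n2, p, n_sim)   # successes in row 2
t_obs = 9                         # condition on this observed column total

keep = (x1 + x2) == t_obs         # retain only tables with the observed margin
cond_x1 = x1[keep]

# Monte Carlo estimate of the conditional frequency distribution of x1 given t
values, counts = np.unique(cond_x1, return_counts=True)
mc_pmf = counts / counts.sum()

# Exact conditional (hypergeometric) distribution for comparison
exact_pmf = hypergeom.pmf(values, M=n1 + n2, n=n1, N=t_obs)

for v, mc, ex in zip(values, mc_pmf, exact_pmf):
    print(f"x1 = {v:2d}:  Monte Carlo {mc:.4f}   hypergeometric {ex:.4f}")
```

The point of the sketch is only that conditioning removes the nuisance parameter p: the conditional distribution used for inference does not depend on the assumed value of p, which the Monte Carlo frequencies should reflect.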

Keywords

Conditional Distribution · Monte Carlo Estimate · Conditional Inference · Ancillary Statistic · Generate Frequency Distribution

Copyright information

© Springer-Verlag Berlin Heidelberg 1980
