Une mesure d'information caracterisant la loi de poisson

  • Conference paper
  • First Online:
Séminaire de Probabilités XXI

Part of the book series: Lecture Notes in Mathematics ((SEMPROBAB,volume 1247))

Abstract

We define an information measure, analogous to Fisher information, for probability measures supported on the non-negative integers. This information has properties similar to those of Fisher information and yields two distinct characterizations of the Poisson distribution. This in turn leads to a characterization of the sequences of probability measures whose accumulation points are Poisson distributions.
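For orientation, the following sketch makes the analogy concrete. It is an assumption for illustration, not the paper's verbatim definition: the continuous (location-parameter) Fisher information of a density $f$ integrates the squared score, and one natural analogue on the non-negative integers replaces the derivative by a backward difference.

```latex
% Continuous Fisher information (location form), minimized among
% laws of fixed variance by the normal distribution:
\[
  I(f) \;=\; \int_{-\infty}^{\infty} \frac{\bigl(f'(x)\bigr)^2}{f(x)}\,dx .
\]
% A discrete analogue on \{0,1,2,\dots\} (a sketch; the paper's exact
% functional may differ), with the convention p(-1) := 0:
\[
  I(p) \;=\; \sum_{x \ge 0} \frac{\bigl(p(x)-p(x-1)\bigr)^{2}}{p(x)}
       \;=\; \mathbb{E}\!\left[\Bigl(1-\tfrac{p(X-1)}{p(X)}\Bigr)^{\!2}\right].
\]
% For the Poisson law p(x) = e^{-\lambda}\lambda^{x}/x!, the ratio
% p(x-1)/p(x) equals x/\lambda, so
\[
  I\bigl(\mathrm{Poisson}(\lambda)\bigr)
  \;=\; \mathbb{E}\!\left[\Bigl(1-\tfrac{X}{\lambda}\Bigr)^{2}\right]
  \;=\; \frac{\operatorname{Var}(X)}{\lambda^{2}}
  \;=\; \frac{1}{\lambda},
\]
% mirroring I = 1/\sigma^2 for the normal law in the continuous case.
```

Under this sketch the Poisson law plays the role the normal law plays for continuous Fisher information, which is consistent with the characterizations the abstract describes.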

Part of this work was completed while the authors were members of MSRI in Berkeley. The authors gratefully acknowledge the support of the NSF and NSERC.




Editor information

Jacques Azéma, Marc Yor, Paul André Meyer


Copyright information

© 1987 Springer-Verlag

About this paper

Cite this paper

Johnstone, I.M., MacGibbon, B. (1987). Une mesure d'information caracterisant la loi de poisson. In: Azéma, J., Yor, M., Meyer, P.A. (eds) Séminaire de Probabilités XXI. Lecture Notes in Mathematics, vol 1247. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0077656


  • DOI: https://doi.org/10.1007/BFb0077656

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-17768-5

  • Online ISBN: 978-3-540-47814-0

  • eBook Packages: Springer Book Archive
