Abstract
We define an information measure, analogous to Fisher information, for probability measures supported on the non-negative integers. This information has properties similar to those of Fisher information and yields two different characterizations of the Poisson law. This in turn leads to a characterization of the sequences of probability measures whose accumulation points are Poisson laws.
Part of this work was completed while the authors were members of MSRI in Berkeley. The authors gratefully acknowledge the support of the NSF and NSERC.
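The paper's exact functional is not reproduced on this page. As an illustrative sketch only, one well-known information measure of this type (not necessarily the authors' definition) is the scaled Fisher information K(P) = λ·E[ρ(X)²] with ρ(x) = (x+1)p(x+1)/(λ·p(x)) − 1, which vanishes exactly for the Poisson law, since the recursion (x+1)p(x+1) = λ·p(x) characterizes Poisson(λ):

```python
import math

def poisson_pmf(lam, n):
    """First n probabilities of a Poisson(lam) law."""
    return [math.exp(-lam) * lam ** x / math.factorial(x) for x in range(n)]

def binomial_pmf(n_trials, p, n):
    """First n probabilities of a Binomial(n_trials, p) law."""
    return [math.comb(n_trials, x) * p ** x * (1 - p) ** (n_trials - x)
            for x in range(n)]

def scaled_fisher_information(pmf):
    """K(P) = lam * sum_x p(x) * rho(x)^2, with rho(x) = (x+1)p(x+1)/(lam p(x)) - 1.

    Zero exactly when the pmf satisfies the Poisson recursion
    (x+1)p(x+1) = lam * p(x); strictly positive otherwise.
    """
    lam = sum(x * px for x, px in enumerate(pmf))  # mean of the (truncated) law
    total = 0.0
    for x in range(len(pmf) - 1):
        if pmf[x] > 0:
            rho = (x + 1) * pmf[x + 1] / (lam * pmf[x]) - 1.0
            total += pmf[x] * rho ** 2
    return lam * total

# Poisson(3): K is ~0 (up to truncation and rounding).
print(scaled_fisher_information(poisson_pmf(3.0, 60)))
# Binomial(10, 0.3), also mean 3: K is strictly positive.
print(scaled_fisher_information(binomial_pmf(10, 0.3, 60)))
```

The names and the particular functional above are assumptions chosen for illustration; they convey the flavor of an information quantity that is minimized by, and thereby characterizes, the Poisson law.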
Copyright information
© 1987 Springer-Verlag
Cite this paper
Johnstone, I.M., MacGibbon, B. (1987). Une mesure d'information caractérisant la loi de Poisson. In: Azéma, J., Yor, M., Meyer, P.A. (eds) Séminaire de Probabilités XXI. Lecture Notes in Mathematics, vol 1247. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0077656
DOI: https://doi.org/10.1007/BFb0077656
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-17768-5
Online ISBN: 978-3-540-47814-0