
On different characterizations of entropies

  • J. Aczél
Conference paper
Part of the Lecture Notes in Mathematics book series (LNM, volume 89)

Keywords

Shannon entropy, Rényi entropy, characterization theorems
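
For reference, the two entropies named in the keywords are the standard quantities; a minimal sketch of the usual definitions, assuming a complete finite distribution $P=(p_1,\dots,p_n)$ with $\sum_k p_k = 1$ (these formulas are supplied here for orientation and are not quoted from the paper):

$$
H(P) = -\sum_{k=1}^{n} p_k \log_2 p_k,
\qquad
H_\alpha(P) = \frac{1}{1-\alpha}\,\log_2 \sum_{k=1}^{n} p_k^{\alpha}
\quad (\alpha > 0,\ \alpha \neq 1),
$$

with $H_\alpha(P) \to H(P)$ as $\alpha \to 1$. The references below characterize these quantities axiomatically, typically as solutions of systems of functional equations.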


Bibliography

  [1] J. Aczél, Zur gemeinsamen Charakterisierung der Entropien α-ter Ordnung und der Shannonschen Entropie nicht unbedingt vollständiger Verteilungen. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 3 (1964), 177–183.
  [2] J. Aczél and Z. Daróczy, Charakterisierung der Entropien positiver Ordnung und der Shannonschen Entropie. Acta Math. Acad. Sci. Hungar. 14 (1963), 95–121.
  [3] J. Aczél and Z. Daróczy, Sur la caractérisation axiomatique des entropies d'ordre positif, y comprise l'entropie de Shannon. C. R. Acad. Sci. Paris 257 (1963), 1581–1584.
  [4] J. Aczél and Z. Daróczy, Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind. Publ. Math. Debrecen 10 (1963), 171–190.
  [5] J. Aczél and J. Pfanzagl, Remarks on the Measurement of Subjective Probability and Information. Metrika 11 (1966), 91–105.
  [6] L. Baiocchi, Su un sistema di equazioni funzionali connesso alla teoria dell'informazione. Boll. Un. Mat. Ital. (2) 22 (1967), 236–246.
  [7] R. Borges, Zur Herleitung der Shannonschen Information. Math. Z. 96 (1967), 282–287.
  [8] L. L. Campbell, A Coding Theorem and Rényi's Entropy. Information and Control 8 (1965), 423–429.
  [9] L. L. Campbell, Definition of Entropy by Means of a Coding Problem. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 6 (1966), 113–118.
  [10] T. W. Chaundy and J. B. McLeod, On a Functional Equation. Proc. Edinburgh Math. Soc. Edinburgh Math. Notes 43 (1960), 7–8.
  [11] Z. Daróczy, Über die gemeinsame Charakterisierung der zu den nicht vollständigen Verteilungen gehörigen Entropien von Shannon und Rényi. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 1 (1963), 381–388.
  [12] Z. Daróczy, Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen. Acta Math. Acad. Sci. Hungar. 15 (1964), 203–210.
  [13] Z. Daróczy, Über eine Charakterisierung der Shannonschen Entropie. Statistica 27 (1967), 189–205.
  [14] Z. Daróczy, Über ein Funktionalgleichungssystem der Informationstheorie. Manuscript.
  [15] Z. Daróczy, Über die Charakterisierung der Shannonschen Entropie. Manuscript.
  [16] Z. Daróczy, On the Shannon Measure of Information (Hungarian). Manuscript.
  [17] D. K. Faddeev, On the Concept of Entropy of a Finite Probabilistic Scheme (Russian). Uspehi Mat. Nauk 11 (1956), no. 1(67), 227–231.
  [18] R. S. Ingarden, A Simplified Axiomatic Definition of Information. Bull. Acad. Polon. Sci. Sér. Sci. Math. Astronom. Phys. 11 (1963), 209–212.
  [19] R. S. Ingarden, Simplified Axioms for Information without Probability. Prace Mat. 9 (1965), 273–282.
  [20] R. S. Ingarden and K. Urbanik, Information as Fundamental Notion of Statistical Physics. Bull. Acad. Polon. Sci. Sér. Sci. Math. Astronom. Phys. 9 (1961), 313–316.
  [21] R. S. Ingarden and K. Urbanik, Information without Probability. Colloq. Math. 9 (1962), 131–150.
  [22] J. Kampé de Fériet and B. Forte, Information et probabilité. C. R. Acad. Sci. Paris 265 (1967), A 110–A 114, A 142–A 146, A 350–A 353.
  [23] D. G. Kendall, Functional Equations in Information Theory. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 2 (1963), 225–229.
  [24] A. J. Khinchin, The Concept of Entropy in the Theory of Probability (Russian). Uspehi Mat. Nauk 8 (1953), no. 3(55), 3–20.
  [25] P. M. Lee, On the Axioms of Information Theory. Ann. Math. Statist. 35 (1964), 414–441.
  [26] N. Pintacuda, Shannon Entropy: A More General Derivation. Statistica 26 (1966), 509–524.
  [27] A. Rényi, On Measures of Entropy and Information. Proc. 4th Berkeley Symp. Math. Statist. and Probability 1960, Vol. I, Univ. of Calif. Press, Berkeley, Calif., 1961, 547–561.
  [28] A. Rényi, Letter to J. Aczél, May 31, 1965.
  [29] C. E. Shannon, A Mathematical Theory of Communication. Bell System Tech. J. 27 (1948), 379–423, 623–656.
  [30] H. Tverberg, A New Derivation of the Information Function. Math. Scand. 6 (1958), 297–298.

Copyright information

© Springer-Verlag 1969

Authors and Affiliations

  • J. Aczél, University of Waterloo
