On a Riemannian approach to the order α relative entropy

  • M. Miyata
  • K. Kato
  • M. Yamada
  • T. Kawaguchi
Part of the Mathematics and Its Applications book series (MAIA, volume 350)

Abstract

To generalize Shannon’s entropy, A. Rényi [1] introduced the order α entropy in 1960, and in 1967 J. Havrda and F. Charvát [2] gave another generalization of Shannon’s entropy. Rényi [1] also generalized the mutual information of random variables to the order α relative entropy, and in 1987 N. Muraki and T. Kawaguchi [3], proceeding in the same way, extended Havrda–Charvát’s order α entropy to an order α relative entropy. These relative entropies measure the discrepancy between two distinct probability distributions; in statistics such a quantity is known as a divergence. I. Csiszár [4] defined the f-divergence, a generalization of Kullback–Leibler’s I-divergence [5], by means of an arbitrary convex function f defined on (0, ∞). On the other hand, J. Burbea and C. R. Rao obtained the Kα-divergence by substituting Havrda–Charvát’s entropy into the Φ-entropy functional they defined on stochastic spaces [6]. We also defined other divergences from a different standpoint [7]. None of these divergences satisfies the axioms of a distance.
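
For orientation, the quantities named above admit standard forms for discrete distributions p = (p_i) and q = (q_i); the normalizations written below are common conventions and are not taken from the paper itself:

\[
H_\alpha^{\mathrm{R}}(p) \;=\; \frac{1}{1-\alpha}\,\log\sum_i p_i^{\alpha},
\qquad
H_\alpha^{\mathrm{HC}}(p) \;=\; \frac{1}{2^{1-\alpha}-1}\Bigl(\sum_i p_i^{\alpha}-1\Bigr),
\qquad \alpha>0,\ \alpha\neq 1,
\]
\[
D_\alpha(p\,\|\,q) \;=\; \frac{1}{\alpha-1}\,\log\sum_i p_i^{\alpha}\,q_i^{1-\alpha},
\qquad
D_f(p\,\|\,q) \;=\; \sum_i q_i\, f\!\Bigl(\frac{p_i}{q_i}\Bigr),
\]

where f is a convex function on (0, ∞) with f(1) = 0; the choice f(t) = t log t recovers the Kullback–Leibler I-divergence [5].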

Keywords

Probability Density Function, Mutual Information, Divergence Measure, Relative Entropy, Entropy Function

References

  1. A. Rényi: On measures of entropy and information, Proc. 4th Berkeley Symposium on Math. Statist. and Probability, 1 (1960), pp. 547–561, Berkeley.
  2. J. Havrda and F. Charvát: Quantification method of classification processes: concept of structural α-entropy, Kybernetika, 3 (1967), pp. 30–35.
  3. N. Muraki and T. Kawaguchi: On a generalization of Havrda–Charvát’s α-entropy to relative α-entropy and its properties in a continuous system, Tensor, N. S., 46 (1987), pp. 154–167.
  4. I. Csiszár: A class of measures of informativity of observation channels, Periodica Mathematica Hungarica, 2(1–4) (1972), pp. 191–213.
  5. S. Kullback and R. A. Leibler: On information and sufficiency, Ann. Math. Statist., 22 (1951), pp. 79–86.
  6. J. Burbea and C. R. Rao: Entropy differential metric, distance and divergence measures in probability spaces: a unified approach, Journal of Multivariate Analysis, 12 (1982), pp. 575–596.
  7. M. Miyata, M. Yamada, S. Kohgo and T. Kawaguchi: On the relation between the statistical divergence and geodesic distance, Reports on Mathematical Physics, 32(3) (1993), pp. 269–278.

Copyright information

© Kluwer Academic Publishers 1996

Authors and Affiliations

  • M. Miyata (1)
  • K. Kato (1)
  • M. Yamada (1)
  • T. Kawaguchi (2)
  1. Department of Information Sciences, Saitama College, Kazo-shi 347, Japan
  2. Institute of Information Sciences and Electronics, University of Tsukuba, Tsukuba-shi 305, Japan