On a Riemannian approach to the order α relative entropy
To generalize Shannon’s entropy, A. Rényi introduced the order α entropy in 1960. In 1967, J. Havrda and F. Charvát gave another generalization of Shannon’s entropy. Moreover, A. Rényi generalized the mutual information of random variables to the order α relative entropy, and in 1987 N. Muraki and T. Kawaguchi exhibited an order α relative entropy based on Havrda–Charvát’s order α entropy, constructed in the same way, as an extension of Rényi’s order α relative entropy. These relative entropies measure the amount of difference between two distinct probability distributions; such a quantity is well known as a divergence in statistics. I. Csiszár defined the f-divergence as a generalization of Kullback–Leibler’s I-divergence, making use of an arbitrary convex function f defined on (0, ∞). On the other hand, J. Burbea and C. R. Rao obtained the Kα-divergence by substituting Havrda–Charvát’s entropy into the Φ-entropy function, which they defined on stochastic spaces. We have also defined other divergences from a different standpoint. These divergences do not satisfy the axioms of a distance.
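For orientation, here is a minimal sketch of the standard forms of these quantities for discrete distributions p = (p_i) and q = (q_i); the notation is ours, and the normalizing constants used in the cited papers may differ:

\[
H_\alpha(p) = \frac{1}{1-\alpha}\,\log \sum_i p_i^{\alpha}
\quad \text{(Rényi's order } \alpha \text{ entropy, } \alpha>0,\ \alpha\neq 1\text{)},
\]
\[
S_\alpha(p) = \frac{1}{2^{\,1-\alpha}-1}\Bigl(\sum_i p_i^{\alpha} - 1\Bigr)
\quad \text{(Havrda–Charvát's order } \alpha \text{ entropy)},
\]
\[
D_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\,\log \sum_i p_i^{\alpha} q_i^{\,1-\alpha}
\quad \text{(Rényi's order } \alpha \text{ relative entropy)},
\]
\[
D_f(p\,\|\,q) = \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right),
\qquad f \text{ convex on } (0,\infty)
\quad \text{(Csiszár's } f\text{-divergence)}.
\]

The choice f(t) = t log t recovers Kullback–Leibler’s I-divergence, and letting α → 1 in the first formula recovers Shannon’s entropy. That these divergences fail the axioms of a distance is seen already in the simplest case: D_f(p‖q) is in general not symmetric in p and q, so neither symmetry nor the triangle inequality holds.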
Keywords: Probability Density Function, Mutual Information, Divergence Measure, Relative Entropy, Entropy Function
1. A. Rényi: On measures of entropy and information, Proc. 4th Berkeley Symposium on Math. Statist. and Probability, 1 (1960), pp. 547–561, Berkeley.