Abstract
This paper presents and discusses two generalized forms of the Shannon entropy, as well as a generalized information measure. These measures are applied to an exponential-power generalization of the usual normal distribution, which emerges from a generalized form of Fisher's entropy-type information measure, a quantity essential to cryptology. Information divergences between these random variables are also discussed. Moreover, a complexity measure related to the generalized Shannon entropy is presented, extending the known SDL complexity measure.
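The exponential-power generalization of the normal distribution mentioned in the abstract has, in its classical univariate form, density f(x) = β/(2αΓ(1/β)) · exp(−(|x−μ|/α)^β), whose Shannon (differential) entropy admits the known closed form h = 1/β − ln(β/(2αΓ(1/β))). The sketch below (an illustrative cross-check of that classical family only, not the authors' γ-ordered normal parameterization) compares the closed form against direct numerical quadrature of −∫ f ln f dx:

```python
import math

def gennorm_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    # Exponential-power ("generalized normal") density:
    # f(x) = beta / (2 alpha Gamma(1/beta)) * exp(-(|x - mu| / alpha)**beta)
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * math.exp(-((abs(x - mu) / alpha) ** beta))

def gennorm_entropy_closed(alpha=1.0, beta=2.0):
    # Known closed form: h = 1/beta - ln(beta / (2 alpha Gamma(1/beta)))
    return 1.0 / beta - math.log(beta / (2.0 * alpha * math.gamma(1.0 / beta)))

def gennorm_entropy_numeric(alpha=1.0, beta=2.0, lo=-30.0, hi=30.0, n=200001):
    # Trapezoidal approximation of -∫ f ln f dx over a wide truncation range
    step = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * step
        f = gennorm_pdf(x, 0.0, alpha, beta)
        if f > 0.0:  # skip underflowed tail values to avoid log(0)
            term = -f * math.log(f)
            total += term if 0 < i < n - 1 else 0.5 * term
    return total * step

if __name__ == "__main__":
    # beta = 2 recovers the Gaussian case; beta = 1 the Laplace case
    for beta in (1.0, 2.0):
        print(beta, gennorm_entropy_closed(1.0, beta),
              gennorm_entropy_numeric(1.0, beta))
```

For β = 2 and α = 1 this reduces to a Gaussian with variance 1/2, and the closed form agrees with the familiar Gaussian entropy ½ ln(2πeσ²); for β = 1 it reduces to the Laplace entropy 1 + ln 2.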
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this chapter
Toulias, T.L., Kitsos, C.P. (2015). Generalizations of Entropy and Information Measures. In: Daras, N., Rassias, M. (eds) Computation, Cryptography, and Network Security. Springer, Cham. https://doi.org/10.1007/978-3-319-18275-9_22
Print ISBN: 978-3-319-18274-2
Online ISBN: 978-3-319-18275-9