Abstract
Information theory is becoming increasingly important in many fields, not only in engineering- and technology-based areas but also in more theoretically oriented sciences such as probability and statistics.
Research supported by INTAS, project 00-738, and by the Danish Natural Science Research Council.
© 2007 János Bolyai Mathematical Society and Springer-Verlag
Topsøe, F. (2007). Information Theory at the Service of Science. In: Csiszár, I., Katona, G.O.H., Tardos, G., Wiener, G. (eds) Entropy, Search, Complexity. Bolyai Society Mathematical Studies, vol 16. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-32777-6_8
Print ISBN: 978-3-540-32573-4
Online ISBN: 978-3-540-32777-6
eBook Packages: Mathematics and Statistics (R0)