Abstract
Information Theory was created by Claude Shannon as a mathematical theory of communication. His fundamental paper {19} appeared in 1948. It was one of the major discoveries of the 20th century, establishing the theoretical foundations of communication engineering and information technology. The key ingredients of Shannon's work were (i) a stochastic model of communication, (ii) the view of information as a commodity whose amount can be measured without regard to meaning, and (iii) the emphasis on coding as a means to enhance information storage and transmission, in particular, to achieve reliable transmission over unreliable channels.
References
Aczél, János — Daróczy, Zoltán, On Measures of Information and their Characterizations, Academic Press (New York, 1975).
Kullback, Solomon, Information Theory and Statistics, Wiley (New York, 1959), Dover (New York, 1978).
Liggett, Thomas M., Interacting Particle Systems, Die Grundlehren der Mathematischen Wissenschaften in Einzeldarstellungen, Band 276, Springer-Verlag (New York, 1985).
Rényi, Alfréd, Selected Papers, ed. Pál Turán, Akadémiai Kiadó (Budapest, 1976).
Rényi, Alfréd, Probability Theory, Translated by László Vekerdi, North-Holland Series in Applied Mathematics and Mechanics, Vol. 10, North-Holland Publishing Company (Amsterdam-London); American Elsevier Publishing Co., Inc. (New York, 1970).
Reza, Fazlollah M., An Introduction to Information Theory, McGraw-Hill (New York, 1961).
Vajda, Igor, Theory of Statistical Inference and Information, Kluwer Academic (Boston, 1989).
Wald, Abraham, Sequential Analysis, John Wiley and Sons (New York) — Chapman and Hall (London, 1947).
{1} A. Barron, Entropy and the central limit theorem, Annals of Probability, 14 (1986), 336–342.
{2} L. Campbell, A coding theorem and Rényi’s entropy, Information and Control, 8 (1965), 423–429.
{3} I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publ. Math. Inst. Hungar. Acad. Sci., 8 (1963), 85–108.
{4} I. Csiszár, Generalized entropy and quantization problems, Trans. Sixth Prague Conference on Inform. Theory, etc., 1971, Academia (Praha, 1973), 299–318.
{5} I. Csiszár, Generalized cutoff rates and Rényi’s information measures, IEEE Trans. Inform. Theory, 41 (1995), 26–34.
{6} I. Csiszár and J. Fischer, Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen, Publ. Math. Inst. Hungar. Acad. Sci., 7 (1962), 159–182.
{7} Z. Daróczy, Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen, Acta Math. Acad. Sci. Hungar., 15 (1964), 203–210.
{8} Z. Daróczy and I. Kátai, Additive zahlentheoretische Funktionen und das Mass der Information, Ann. Univ. Sci. Budapest, Sect. Math., 13 (1970), 83–88.
{9} P. Erdős, On the distribution function of additive functions, Annals of Math., 47 (1946), 1–20.
{10} J. Fritz, An information-theoretical proof of limit theorems for reversible Markov processes, Trans. Sixth Prague Conference on Inform. Theory, etc., 1971. Academia (Praha, 1973), 183–197.
{11} J. Fritz, An approach to the entropy of point processes, Periodica Math. Hungar., 3 (1973), 73–83.
{12} P. Gács, Hausdorff-dimension and probability distribution, Periodica Math. Hungar., 3 (1973), 59–71.
{13} P. Kafka, F. Österreicher and I. Vincze, On powers of f-divergences defining a distance, Studia Sci. Math. Hungar., 26 (1991), 415–422.
{14} D. Kendall, Information theory and the limit-theorem for Markov chains and processes with a countable infinity of states, Annals Inst. Statist. Math., 15 (1963), 137–143.
{15} T. Nemetz, Equivalence-orthogonality dichotomies of probability measures, Limit Theorems of Probability Theory, Colloquia Math. Soc. J. Bolyai, Vol. 11, North Holland (1975), 183–191.
{16} F. Österreicher, The construction of least favourable distributions is traceable to a minimal perimeter problem, Studia Sci. Math. Hungar., 17 (1982), 341–351.
{17} M. Puri and I. Vincze, Measure of information and contiguity, Statistics and Probability Letters, 9 (1990), 223–228.
{18} M. Rudemo, Dimension and entropy for a class of stochastic processes, Publ. Math. Inst. Hungar. Acad. Sci., 9 (1964), 73–87.
{19} C. Shannon, A mathematical theory of communication, Bell System Technical Journal, 27 (1948), 379–423 and 623–656.
{20} C. Shannon, Communication in the presence of noise, Proc. IRE, 37 (1949), 10–21.
Copyright information
© 2006 János Bolyai Mathematical Society and Springer-Verlag
Cite this chapter
Csiszár, I. (2006). Stochastics: Information Theory. In: Horváth, J. (eds) A Panorama of Hungarian Mathematics in the Twentieth Century I. Bolyai Society Mathematical Studies, vol 14. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30721-1_17
DOI: https://doi.org/10.1007/978-3-540-30721-1_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28945-6
Online ISBN: 978-3-540-30721-1
eBook Packages: Mathematics and Statistics (R0)