## Abstract

Secret Agent 00111 was in a most uncharacteristic mood; he was thinking about his career and remembering details that he was trained to forget. With some particularly sordid exceptions, it was not a story of universal appeal. However, as he neared the end of his service, he had been approached with several financial offers for the secrets of his legendary success. The process was all wrong, he thought glumly. The true secret of his success was more second nature than a mathematical equation, and it was probably not so salable as his backers believed. Oh well, he could always lie. . . . However, since he had been asked, he pondered what precisely he had bought and brought to the field of espionage.

## Keywords

Mutual Information · Event Sequence · State Diagram · Entropy Function · Transition Probability Matrix