A Combinatorial Problem Arising in Information Theory: Precise Minimax Redundancy for Markov Sources
Redundancy of a code is defined as the excess of the code length over the optimal code length. When the source of information is unknown, one wants to design the best code for the worst source (within the class of sources under consideration). This is called the minimax redundancy. It comes in two flavors: average case and worst case. The latter is known as the maximal minimax redundancy, and it is studied in this paper for Markovian sources. Surprisingly, this problem led us to an interesting combinatorial problem on directed graphs that we shall solve using analytic tools. To be more precise, we need to count the number of Eulerian cycles in a directed multigraph. The maximal minimax redundancy turns out to be a sum over such Eulerian paths. In particular, we shall prove that the maximal minimax redundancy for Markov sources of order r is asymptotically equal to (m^r(m-1)/2) log n + log A_m + O(1/n), where n is the length of source sequences, m is the size of the alphabet, and A_m is an explicit constant that depends on m.
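The Eulerian-cycle count mentioned above can be computed exactly by the BEST theorem (de Bruijn, van Aardenne-Ehrenfest, Smith, Tutte): the number of Eulerian circuits of a connected Eulerian digraph equals tw(G) · ∏_v (outdeg(v) − 1)!, where tw(G) is the number of arborescences oriented toward any fixed vertex, obtainable from a minor of the out-degree Laplacian. The following sketch is illustrative only and is not the paper's method; all function names are ours.

```python
from fractions import Fraction
from math import factorial, prod
from collections import Counter

def det(m):
    """Exact determinant via fraction-based Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in m]
    n, d = len(m), Fraction(1)
    for i in range(n):
        piv = next((r for r in range(i, n) if m[r][i] != 0), None)
        if piv is None:
            return Fraction(0)          # singular matrix
        if piv != i:
            m[i], m[piv] = m[piv], m[i]
            d = -d                      # row swap flips the sign
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def eulerian_circuits(vertices, edges):
    """Count Eulerian circuits of a connected Eulerian digraph
    (BEST theorem). `edges` is a list of (u, v) arcs; parallel
    arcs are allowed, so this handles directed multigraphs."""
    idx = {v: i for i, v in enumerate(vertices)}
    n = len(vertices)
    out = Counter(u for u, _ in edges)
    # adjacency matrix with multiplicities
    A = [[0] * n for _ in range(n)]
    for u, v in edges:
        A[idx[u]][idx[v]] += 1
    # out-degree Laplacian L = D_out - A (self-loops cancel)
    L = [[(out[vertices[i]] if i == j else 0) - A[i][j]
          for j in range(n)] for i in range(n)]
    # matrix-tree theorem: deleting row/col 0 counts arborescences
    # oriented toward vertices[0]
    minor = [row[1:] for row in L[1:]]
    tw = det(minor) if n > 1 else Fraction(1)
    return int(tw * prod(factorial(out[v] - 1) for v in vertices))
```

For example, the bidirected triangle (both arcs between every pair of three vertices) has three Eulerian circuits up to cyclic rotation, which the sketch reproduces.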
Keywords: Frequency Count, Code Length, Finite Alphabet, Minimax Regret, Universal Code