Abstract
This article surveys applications of convex optimization theory to topics in information theory. Topics include optimal robust algorithms for hypothesis testing; a fresh look at the relationships between channel coding and robust hypothesis testing; and the structure of optimal input distributions in channel coding.
A key finding is that the optimal distribution achieving channel capacity is typically discrete, and that the distribution achieving an optimal error exponent for rates below capacity is always discrete. We find that the resulting codes significantly outperform traditional signal constellation schemes such as QAM and PSK.
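The discreteness phenomenon described above can be illustrated numerically. The sketch below is not from the chapter itself: it runs the standard Blahut-Arimoto iteration over a fine grid of candidate inputs for an amplitude-constrained Gaussian channel, with illustrative parameter choices (peak amplitude A = 1, noise standard deviation σ = 1, and the grid sizes are arbitrary assumptions). The optimizer concentrates essentially all probability mass on a few discrete points rather than spreading it over the grid.

```python
import numpy as np

def blahut_arimoto(P, n_iter=2000, tol=1e-12):
    """Capacity (in nats) and the optimal input distribution for a
    discrete memoryless channel with transition matrix P[i, j] = P(y_j | x_i)."""
    n = P.shape[0]
    p = np.full(n, 1.0 / n)                        # start from the uniform input law
    for _ in range(n_iter):
        q = p @ P                                  # induced output distribution
        D = np.sum(P * np.log(P / q), axis=1)      # D( P(.|x_i) || q ) for each input
        p_new = p * np.exp(D)                      # multiplicative Blahut-Arimoto update
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    q = p @ P
    D = np.sum(P * np.log(P / q), axis=1)
    return float(p @ D), p                         # at convergence, C = sum_i p_i D_i

# Amplitude-constrained Gaussian channel: |X| <= A, noise N(0, sigma^2).
A, sigma = 1.0, 1.0
x = np.linspace(-A, A, 21)                         # candidate input mass points
y = np.linspace(-A - 5 * sigma, A + 5 * sigma, 301)
P = np.exp(-(y[None, :] - x[:, None]) ** 2 / (2 * sigma ** 2))
P /= P.sum(axis=1, keepdims=True)                  # row-normalize to P(y_j | x_i)

C, p = blahut_arimoto(P)
# For this peak constraint the iteration pushes nearly all mass onto the
# two endpoints +/-A: a discrete optimum, not a density over [-A, A].
```

In this sketch, tightening the peak constraint collapses the optimal law to the two endpoints, while widening A makes additional discrete mass points appear; a densely supported "continuous" optimum never emerges, consistent with the discreteness result stated in the abstract.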
Supported in part by NSF grant ITR 00-85929.
Supported in part by NSF CAREER grant 6891730.
© 2007 Springer Science+Business Media, LLC
Huang, J., Pandit, C., Meyn, S.P., Médard, M., Veeravalli, V. (2007). Entropy, Inference, and Channel Coding. In: Agrawal, P., Fleming, P.J., Zhang, L., Andrews, D.M., Yin, G. (eds) Wireless Communications. The IMA Volumes in Mathematics and its Applications, vol 143. Springer, New York, NY. https://doi.org/10.1007/978-0-387-48945-2_5
Publisher Name: Springer, New York, NY
Print ISBN: 978-0-387-37269-3
Online ISBN: 978-0-387-48945-2
eBook Packages: Mathematics and Statistics (R0)