Abstract
In Chapter 12, we introduced the regions Γ*_n and Γ_n in the entropy space H_n for n random variables. From Γ*_n, one can in principle determine whether any information inequality always holds. The region Γ_n, defined by the set of all basic inequalities (equivalently, all elemental inequalities) involving n random variables, is an outer bound on Γ*_n. From Γ_n, one can determine whether an information inequality is implied by the basic inequalities; if so, it is called a Shannon-type inequality. Since the basic inequalities always hold, so do all Shannon-type inequalities. In the last chapter, we showed how machine proving of all Shannon-type inequalities is made possible by the linear structure of Γ_n.
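The machine-proving idea the abstract refers to can be illustrated with a minimal sketch for n = 2. By LP duality (Farkas' lemma), a linear inequality b·h ≥ 0 holds over the cone Γ_2 cut out by the elemental inequalities G·h ≥ 0 if and only if b is a nonnegative combination of the rows of G. The function name `is_shannon_type` and the direct Gaussian-elimination shortcut (which works here only because the 3×3 matrix G for n = 2 happens to be invertible, so the combination is unique) are illustrative assumptions, not the book's actual procedure, which uses a general linear program.

```python
# Sketch: test whether a linear entropy inequality is Shannon-type for
# n = 2, i.e. implied by the elemental inequalities, via LP duality.
# Entropy-space coordinates: h = (H(X), H(Y), H(X,Y)).
from fractions import Fraction

# Rows of G: the three elemental inequalities for n = 2, each G_i . h >= 0:
#   H(X|Y) = -H(Y) + H(X,Y)         >= 0
#   H(Y|X) = -H(X) + H(X,Y)         >= 0
#   I(X;Y) =  H(X) + H(Y) - H(X,Y)  >= 0
G = [[0, -1, 1],
     [-1, 0, 1],
     [1, 1, -1]]

def is_shannon_type(b):
    """b . h >= 0 holds on Gamma_2 iff b = G^T lam for some lam >= 0.
    For n = 2, G is invertible, so the candidate lam is unique; we
    solve G^T lam = b exactly over the rationals and check its sign."""
    n = 3
    # Augmented system [G^T | b], eliminated by Gauss-Jordan.
    A = [[Fraction(G[j][i]) for j in range(n)] + [Fraction(b[i])]
         for i in range(n)]
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        A[col] = [x / A[col][col] for x in A[col]]
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    lam = [A[r][n] for r in range(n)]
    return all(l >= 0 for l in lam)

# I(X;Y) >= 0, i.e. H(X) + H(Y) - H(X,Y) >= 0: Shannon-type.
print(is_shannon_type([1, 1, -1]))   # True
# H(X) >= H(Y) is not implied by the basic inequalities.
print(is_shannon_type([1, -1, 0]))   # False
```

For general n the matrix of elemental inequalities is rectangular, so one instead runs a linear program (minimize b·h subject to G·h ≥ 0): the minimum is 0 exactly when the inequality is Shannon-type, which is the approach behind software such as ITIP.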
Copyright information
© 2002 Springer Science+Business Media New York
Cite this chapter
Yeung, R.W. (2002). Beyond Shannon-Type Inequalities. In: A First Course in Information Theory. Information Technology: Transmission, Processing and Storage. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-8608-5_14
DOI: https://doi.org/10.1007/978-1-4419-8608-5_14
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4613-4645-6
Online ISBN: 978-1-4419-8608-5