Abstract
We discuss some of the recent literature on relations between information- and estimation-theoretic quantities. We begin by exploring the connections between mutual information and causal/non-causal, matched/mismatched estimation for the setting of a continuous-time source corrupted by white Gaussian noise. Relations between causal estimation, in both the matched and mismatched cases, and mutual information persist in the presence of feedback. We present a new unified framework, based on Girsanov theory and Itô calculus, for deriving these relations, and conclude by deriving some new results within this framework.
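The continuous-time relations discussed above have a well-known static (scalar) counterpart: for the channel \(Y = \sqrt{\mathrm{snr}}\,X + N\) with standard Gaussian noise, the derivative of the mutual information with respect to snr equals half the minimum mean-square error (the I-MMSE relation of Guo, Shamai, and Verdú). The sketch below checks this numerically for a standard Gaussian input, for which both quantities have the standard closed forms \(I(\mathrm{snr}) = \tfrac{1}{2}\log(1+\mathrm{snr})\) and \(\mathrm{mmse}(\mathrm{snr}) = 1/(1+\mathrm{snr})\); it is an illustration of the scalar identity only, not of the feedback setting treated in the chapter.

```python
import numpy as np

# Scalar I-MMSE check for Y = sqrt(snr)*X + N, with X ~ N(0,1), N ~ N(0,1).
# Closed forms for a Gaussian input:
#   I(snr)    = 0.5 * log(1 + snr)
#   mmse(snr) = 1 / (1 + snr)
# The I-MMSE relation states dI/dsnr = 0.5 * mmse(snr).

def mutual_info(snr):
    """Mutual information I(X; Y) in nats for a standard Gaussian input."""
    return 0.5 * np.log1p(snr)

def mmse(snr):
    """Minimum mean-square error of estimating X from Y."""
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6  # step for a central finite-difference approximation of dI/dsnr
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)

# The derivative of the mutual information matches half the MMSE.
assert abs(deriv - 0.5 * mmse(snr)) < 1e-8
```

The same identity, integrated over snr, recovers the mutual information from the MMSE curve; Duncan's theorem plays the analogous role for causal estimation in continuous time.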
Notes
1. Define the filtration \(\mathcal{F}^{Y}_{t} = \sigma\{ Y(B) : B \subseteq \{s : s < t\} \}\). Note that in the setting of Theorem 5.5, the encoder \(\phi_t\) is measurable with respect to the σ-algebra \(\mathcal{F}^{X}_{T} \vee \mathcal{F}^{Y}_{t}\), and the estimate \(\hat{\phi}_{t}\) is measurable with respect to (i.e., adapted to the filtration) \(\mathcal{F}^{Y}_{t}\).
Acknowledgement
This research was supported by LCCC—Linnaeus Grant VR 2007-8646, Swedish Research Council.
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this chapter
Asnani, H., Venkat, K., Weissman, T. (2014). Relations Between Information and Estimation in the Presence of Feedback. In: Como, G., Bernhardsson, B., Rantzer, A. (eds) Information and Control in Networks. Lecture Notes in Control and Information Sciences, vol 450. Springer, Cham. https://doi.org/10.1007/978-3-319-02150-8_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-02149-2
Online ISBN: 978-3-319-02150-8