Probability and Information Theory
This chapter serves as an introduction to concepts from elementary probability theory and information theory in the concrete context of the real line and multi-dimensional Euclidean space. The probabilistic concepts of mean, variance, expected value, marginalization, conditioning, and conditional expectation are reviewed. In this part of the presentation there is some overlap with the previous chapter, which has some pedagogical benefit. There will be no mention of Borel measurability, σ-algebras, filtrations, or martingales, as these are treated in numerous other books on probability theory and stochastic processes such as [1, 14, 15, 32, 27, 48]. The presentation here, while drawing from these excellent works, is restricted to those topics that are required either in the mathematical and computational modeling of stochastic physical systems, or in the determination of properties of solutions to the equations in these models. Basic concepts of information theory are addressed, such as measures of distance, or "divergence," between probability density functions, and the properties of "information" and entropy. All pdfs treated here will be differentiable functions on R^n. Therefore the entropy and information measures addressed in this chapter are those referred to in the literature as the "differential" or "continuous" versions.
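As a concrete illustration of two of the quantities mentioned above, the sketch below numerically approximates the differential entropy of a Gaussian pdf on the real line and the Kullback–Leibler divergence between two Gaussian pdfs, and compares each against its well-known closed form. This is a minimal check, not taken from the chapter itself; the choice of parameters and the simple Riemann-sum quadrature are illustrative assumptions.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Grid fine and wide enough that the Gaussian tails contribute negligibly.
x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]

mu_p, sig_p = 0.0, 2.0   # illustrative parameters (assumptions)
mu_q, sig_q = 1.0, 3.0
p = gaussian_pdf(x, mu_p, sig_p)
q = gaussian_pdf(x, mu_q, sig_q)

# Differential entropy h(p) = -∫ p(x) ln p(x) dx,
# with closed form 0.5 * ln(2*pi*e*sigma^2) for a Gaussian.
h_numeric = -np.sum(p * np.log(p)) * dx
h_closed = 0.5 * np.log(2.0 * np.pi * np.e * sig_p ** 2)

# KL divergence D(p||q) = ∫ p(x) ln(p(x)/q(x)) dx,
# with the standard closed form for two univariate Gaussians.
kl_numeric = np.sum(p * np.log(p / q)) * dx
kl_closed = (np.log(sig_q / sig_p)
             + (sig_p ** 2 + (mu_p - mu_q) ** 2) / (2.0 * sig_q ** 2)
             - 0.5)

print(h_numeric, h_closed)    # both ≈ 2.112
print(kl_numeric, kl_closed)  # both ≈ 0.183
```

Note that the divergence is nonnegative and vanishes only when the two densities coincide, whereas the differential entropy, unlike its discrete counterpart, can be negative (e.g., for a Gaussian with small enough variance).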
Keywords: Probability Density Function · Central Limit Theorem · Conditional Expectation · Fisher Information · Fisher Information Matrix
- 3. Bertsekas, D., Convex Analysis and Optimization, Athena Scientific, 2003.
- 7. Brown, L.D., "A proof of the Central Limit Theorem motivated by the Cramér–Rao inequality," in G. Kallianpur, P.R. Krishnaiah, and J.K. Ghosh, eds., Statistics and Probability: Essays in Honour of C.R. Rao, pp. 141–148, North-Holland, New York, 1982.
- 37. Pennec, X., "Probabilities and statistics on Riemannian manifolds: Basic tools for geometric measurements," IEEE Workshop on Nonlinear Signal and Image Processing, 1999.
- 44. Smith, S.T., "Covariance, subspace, and intrinsic Cramér–Rao bounds in signal processing," IEEE Transactions on Signal Processing, 53, pp. 1610–1630, 2005.