# Gaussian Signals, Correlation Matrices, and Sample Path Properties

## Abstract

In general, determining the shape of the sample paths of a random signal $X(t)$ requires knowledge of the $n$-point probabilities $P(a_1 < X(t_1) < b_1, \dots, a_n < X(t_n) < b_n)$ for arbitrary $n$ and arbitrary windows $a_1 < b_1, \dots, a_n < b_n$. Usually this information cannot be recovered when the only known signal characteristic is the autocorrelation function: the latter depends on the two-point distributions but does not determine them uniquely. In the case of Gaussian signals, however, the autocorrelations determine not only the two-point probability distributions but all of the $n$-point probability distributions, so that complete information is available within the second-order theory. In practical terms, this means that fitting a Gaussian model requires estimating only means and covariances. Moreover, in the Gaussian universe, weak stationarity implies strict stationarity as defined in Chapter 4. For the sake of simplicity, all signals in this chapter are assumed to be real-valued. The chapter ends with a more subtle analysis of sample path properties of stationary signals, such as continuity and differentiability; in the Gaussian case the available information is particularly complete.
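The point that the autocorrelation function alone determines every $n$-point distribution of a zero-mean Gaussian signal can be illustrated numerically: given a covariance kernel, the joint law at any sampling instants is multivariate normal with the covariance matrix built from that kernel. The following sketch (the squared-exponential kernel and all parameter values are arbitrary choices for demonstration, not taken from the text) simulates sample paths and checks that their empirical covariance recovers the kernel.

```python
import numpy as np

# Hypothetical autocorrelation function chosen for illustration only;
# any positive-definite kernel would do.
def autocorrelation(tau):
    return np.exp(-tau**2)

t = np.linspace(0.0, 5.0, 50)                 # sampling instants t_1, ..., t_n
K = autocorrelation(t[:, None] - t[None, :])  # K[i, j] = r(t_i - t_j)
K += 1e-9 * np.eye(len(t))                    # tiny jitter for numerical stability

# For a zero-mean Gaussian signal, K fixes the entire n-point distribution,
# so sample paths can be drawn directly from the multivariate normal law.
rng = np.random.default_rng(0)
paths = rng.multivariate_normal(np.zeros(len(t)), K, size=5000)

# The empirical covariance of the simulated paths approximates K;
# the discrepancy shrinks as the number of paths grows.
emp_cov = np.cov(paths, rowvar=False)
print(np.max(np.abs(emp_cov - K)))
```

Nothing beyond the second-order data (zero mean and the kernel) enters the simulation, which is exactly the claim made above for Gaussian signals.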

## Keywords

Autocorrelation function, sample path, correlation matrix, random quantity, random signal

## References

33. See, e.g., M. Denker and W. A. Woyczyński's book mentioned in previous chapters.
34. This argument relies on the so-called Cauchy criterion of convergence for random quantities with finite variance: a sequence $X_n$ converges in the mean square as $n \to \infty$, that is, there exists a random quantity $X$ such that $\lim_{n\to\infty} E(X_n - X)^2 = 0$, if and only if $\lim_{n,m\to\infty} E(X_n - X_m)^2 = 0$. This criterion permits verification of the convergence without knowing what the limit is; see, e.g., Theorem 11.4.2 in W. Rudin, Principles of Mathematical Analysis, McGraw-Hill, New York, 1976.
35. For details, see M. Loève, Probability Theory, Van Nostrand, Princeton, NJ, 1963, Section 34.3.
36. For a more complete discussion of this theorem and its consequences for sample path continuity and differentiability of random signals, see, for example, M. Loève, Probability Theory, Van Nostrand, Princeton, NJ, 1963, Section 35.3.
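The mean-square Cauchy criterion quoted in the notes above can be made concrete with a toy sequence (my own example, not from the text): for $X_n = \sum_{k=1}^{n} Z_k / k$ with i.i.d. standard normal $Z_k$, independence gives $E(X_n - X_m)^2 = \sum_{k=m+1}^{n} 1/k^2$ for $n > m$, which tends to $0$ as $m, n \to \infty$. The criterion therefore guarantees mean-square convergence even though the limit has no closed form, which is exactly what makes it useful.

```python
import numpy as np

# E(X_n - X_m)^2 for the illustrative sequence X_n = sum_{k<=n} Z_k / k,
# computed exactly from independence of the Z_k (no simulation needed):
#   E(X_n - X_m)^2 = sum_{k=m+1}^{n} 1/k^2,   n > m.
def ms_distance(m, n):
    k = np.arange(m + 1, n + 1)
    return np.sum(1.0 / k**2)

# The mean-square gap shrinks as m grows, verifying the Cauchy criterion.
for m in (10, 100, 1000):
    print(m, ms_distance(m, 10 * m))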