Abstract
Let P be a Markov kernel on \(\mathsf {X}\times \mathscr {X}\) that admits an invariant probability measure \(\pi \) and let \(\{X_k,\, k\in \mathbb {N}\}\) be the canonical Markov chain.
21.A A Covariance Inequality
Lemma 21.A.1
Let \((\varOmega ,\mathscr {A},\mathbb {P})\) be a probability space and X, Y two square-integrable random variables defined on \((\varOmega ,\mathscr {A},\mathbb {P})\). Define
\[ \alpha = \sup \left\{ |\mathbb {P}(A\cap B)-\mathbb {P}(A)\mathbb {P}(B)| \,:\, A\in \sigma (X),\ B\in \sigma (Y) \right\} . \]
Then, for all \(p,q,r\in \left[ 1,\infty \right] \) such that \(p^{-1}+q^{-1}+r^{-1}=1\),
\[ |\mathrm {Cov}(X,Y)| \le 4\, \alpha ^{1/r} \int _0^\infty \mathbb {P}^{1/p}(|X|>x)\, \mathrm {d}x \int _0^\infty \mathbb {P}^{1/q}(|Y|>y)\, \mathrm {d}y . \tag{21.A.1} \]
Proof
For X, Y two square-integrable random variables defined on a probability space \((\varOmega ,\mathscr {A},\mathbb {P})\), one has
\[ \mathrm {Cov}(X,Y) = \int _0^\infty \int _0^\infty \mathrm {Cov}\left( I_x(X), I_y(Y)\right) \, \mathrm {d}x \, \mathrm {d}y , \tag{21.A.2} \]
where for \(x\ge 0\) we set \(I_x(t) = \mathbb {1}\left\{ t> x\right\} - \mathbb {1}\left\{ t < -x\right\} \). Note indeed that every random variable X can be written as
\[ X = X^+ - X^- = \int _0^\infty \mathbb {1}\left\{ X> x\right\} \, \mathrm {d}x - \int _0^\infty \mathbb {1}\left\{ X < -x\right\} \, \mathrm {d}x = \int _0^\infty I_x(X) \, \mathrm {d}x . \]
Writing Y similarly and applying Fubini’s theorem yields (21.A.2). Since the functions \(I_x\) are uniformly bounded by 1, we obtain
\[ |\mathrm {Cov}\left( I_x(X), I_y(Y)\right) | \le 4 \alpha . \]
On the other hand, using that \({\mathbb E}\left[ |I_x(X)| \right] = \mathbb {P}(|X|>x)\), we get
\[ |\mathrm {Cov}\left( I_x(X), I_y(Y)\right) | \le 2 \left( \mathbb {P}(|X|>x) \wedge \mathbb {P}(|Y|>y) \right) . \]
Plugging these bounds into (21.A.2), we obtain
\[ |\mathrm {Cov}(X,Y)| \le \int _0^\infty \int _0^\infty \left( 4\alpha \right) \wedge \left( 2 \mathbb {P}(|X|>x)\right) \wedge \left( 2 \mathbb {P}(|Y|>y)\right) \, \mathrm {d}x \, \mathrm {d}y . \]
The proof is concluded by applying Hölder’s inequality: since \(p^{-1}+q^{-1}+r^{-1}=1\),
\[ \left( 4\alpha \right) \wedge \left( 2 \mathbb {P}(|X|>x)\right) \wedge \left( 2 \mathbb {P}(|Y|>y)\right) \le \left( 4\alpha \right) ^{1/r} \left( 2 \mathbb {P}(|X|>x)\right) ^{1/p} \left( 2 \mathbb {P}(|Y|>y)\right) ^{1/q} , \]
and \(4^{1/r}\, 2^{1/p}\, 2^{1/q} = 2^{1+1/r} \le 4\). \({\Box }\)
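For a small discrete pair (X, Y), every quantity in the bound is computable exactly: \(\alpha \) by enumerating all events of \(\sigma (X)\) and \(\sigma (Y)\), and the tail integrals as finite sums. The sketch below (not from the book; the joint distribution and the exponents p = q = 4, r = 2 are arbitrary illustrative choices) verifies the inequality numerically with NumPy.

```python
import numpy as np

# Arbitrary small joint distribution for (X, Y); rows index X values, columns Y values.
xs = np.array([-1.0, 0.0, 2.0])
ys = np.array([-2.0, 1.0, 3.0])
joint = np.array([[0.10, 0.05, 0.10],
                  [0.05, 0.20, 0.10],
                  [0.15, 0.05, 0.20]])
px, py = joint.sum(axis=1), joint.sum(axis=0)

# alpha = sup |P(A ∩ B) - P(A)P(B)| over A in sigma(X), B in sigma(Y):
# brute force over all subsets of the two (3-point) supports.
subsets = [np.flatnonzero([m >> i & 1 for i in range(3)]) for m in range(8)]
alpha = max(abs(joint[np.ix_(A, B)].sum() - px[A].sum() * py[B].sum())
            for A in subsets for B in subsets)

cov = (joint * np.outer(xs, ys)).sum() - (px @ xs) * (py @ ys)

def tail_integral(vals, probs, p):
    """Exact value of the integral of P(|Z| > x)^(1/p) over x in [0, inf):
    the tail probability is piecewise constant between the atoms of |Z|."""
    av = np.abs(vals)
    total, prev = 0.0, 0.0
    for a in np.unique(av[av > 0]):
        total += (a - prev) * probs[av >= a].sum() ** (1.0 / p)  # P(|Z| > x) on [prev, a)
        prev = a
    return total

p, q, r = 4.0, 4.0, 2.0  # 1/p + 1/q + 1/r = 1
bound = 4.0 * alpha ** (1.0 / r) * tail_integral(xs, px, p) * tail_integral(ys, py, q)
assert abs(cov) <= bound
print(f"|Cov(X,Y)| = {abs(cov):.4f} <= {bound:.4f}")
```

The brute-force enumeration is exponential in the number of atoms, so this check only makes sense for toy supports; it confirms the direction and scale of the bound, not its sharpness.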
Lemma 21.A.2
Let \((\varOmega ,\mathscr {A},\mathbb {P})\) be a probability space. Let X be a real-valued random variable and V a random variable, uniformly distributed on \(\left[ 0,1\right] \) and independent of X, both defined on \((\varOmega ,\mathscr {A},\mathbb {P})\). Define \(F_X(x^-)=\lim _{y\uparrow x} F_X(y)\), \(\varDelta F_X(x) = F_X(x) -F_X(x^-)\), where \(F_X\) is the cumulative distribution function of X, and
\[ U = 1 - F_X(X^-) - V \varDelta F_X(X) . \]
Then U is uniformly distributed on \(\left[ 0,1\right] \) and \(Q_X(U)=X\), where \(Q_X(u) = \inf \left\{ x\in \mathbb {R}\,:\, \mathbb {P}(X>x) \le u\right\} \) is the tail quantile function of X.
Proof
That \(Q_X(U)=X\) is straightforward, since by definition, \(Q_X(v)=x\) for all \(v\in \left[ 1-F_X(x), 1-F_X(x^-)\right] \), whether there is a jump at x or not. To check that U is uniformly distributed over \(\left[ 0,1\right] \), note that \(\mathbb {P}(X>x)>u\) if and only if \(Q_X(u)>x\). Since V is uniformly distributed on [0, 1], this yields, for all \(u\in \left[ 0,1\right] \),
\[ \mathbb {P}(U \le u) = u . \]
\({\Box }\)
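The construction lends itself to a quick simulation: for a discrete distribution (chosen arbitrarily below, so that \(F_X\) has genuine jumps), the randomized variable U should be uniform on [0, 1] and the tail quantile function should reconstruct X. A NumPy sketch, not from the book, using \(Q_X(u) = \inf \{x : \mathbb {P}(X>x) \le u\}\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary discrete distribution with atoms, so F_X has jumps.
vals = np.array([-1.0, 0.5, 2.0])
probs = np.array([0.3, 0.5, 0.2])
cdf = np.cumsum(probs)      # F_X at each atom
cdf_minus = cdf - probs     # F_X(x^-) at each atom

n = 100_000
idx = rng.choice(len(vals), size=n, p=probs)  # samples of X, stored as atom indices
V = rng.random(n)                             # uniform on [0, 1), independent of X

# U = 1 - F_X(X^-) - V * dF_X(X); computing W = 1 - U first avoids a rounding step.
W = cdf_minus[idx] + V * probs[idx]
U = 1.0 - W

# Tail quantile Q_X(u): the smallest atom x_k with P(X > x_k) <= u,
# i.e. the first index k with F_X(x_k) >= 1 - u = W.
Q_of_U = vals[np.searchsorted(cdf, W, side="left")]

print("fraction with Q_X(U) == X:", (Q_of_U == vals[idx]).mean())
print("sample mean of U:", U.mean())
```

The identity \(Q_X(U)=X\) holds for every sample (up to an event of probability zero at jump boundaries), and the empirical distribution of U is flat across deciles, as the lemma predicts.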
Lemma 21.A.3
Let \((\varOmega ,\mathscr {A},\mathbb {P})\) be a probability space and \(\mathscr {B}\) a sub-\(\sigma \)-algebra of \(\mathscr {A}\). Let X be a square-integrable random variable and \(Y = {\mathbb E}\left[ X \mid \mathscr {B}\right] \). Then for all \(a\in \left[ 0,1\right] \),
\[ \int _0^a Q_Y^2(u) \, \mathrm {d}u \le \int _0^a Q_X^2(u) \, \mathrm {d}u . \]
Proof
By Lemma 21.A.2, let V be a uniformly distributed random variable, independent of \(\mathscr {B}\) and X, and define \(U = 1-F_Y(Y^-) - V \{F_Y(Y)-F_Y(Y^-)\}\). Set \(\mathscr {G}=\mathscr {B}\vee \sigma (V)\). Then \(Q_Y(U) = Y\) is \(\mathscr {B}\)-measurable and, since V is independent of \(\sigma (X)\vee \mathscr {B}\), \({\mathbb E}\left[ X \mid \mathscr {G}\right] = {\mathbb E}\left[ X \mid \mathscr {B}\right] = Y\). Applying Jensen’s inequality and using that U is \(\mathscr {G}\)-measurable and uniformly distributed, we obtain
\[ \int _0^a Q_Y^2(u) \, \mathrm {d}u = {\mathbb E}\left[ Y^2 \mathbb {1}\left\{ U \le a\right\} \right] = {\mathbb E}\left[ \left( {\mathbb E}\left[ X \mid \mathscr {G}\right] \right) ^2 \mathbb {1}\left\{ U \le a\right\} \right] \le {\mathbb E}\left[ X^2 \mathbb {1}\left\{ U \le a\right\} \right] . \]
Noting that \(\mathbb {P}(X^2>x) > u\) if and only if \(Q_X^2(u) > x\) and applying Fubini’s theorem, we obtain
\[ {\mathbb E}\left[ X^2 \mathbb {1}\left\{ U \le a\right\} \right] = \int _0^\infty \mathbb {P}\left( X^2> x,\, U \le a\right) \, \mathrm {d}x \le \int _0^\infty \mathbb {P}(X^2>x) \wedge a \, \mathrm {d}x = \int _0^a Q_X^2(u) \, \mathrm {d}u . \]
\({\Box }\)
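On a finite probability space the two integrals reduce to weighted sums over the largest values of \(Y^2\) and \(X^2\), so the lemma can be checked exactly for every level a. In the sketch below (not from the book), the atom weights, the values of X, and the partition generating \(\mathscr {B}\) are arbitrary illustrative choices.

```python
import numpy as np

# Finite probability space: 6 atoms with weights w, a random variable X,
# and a sub-sigma-algebra B generated by a partition into three blocks.
w = np.array([0.10, 0.20, 0.10, 0.25, 0.15, 0.20])
X = np.array([3.0, -1.0, 2.0, 0.5, -2.5, 1.5])
blocks = [[0, 1], [2, 3], [4, 5]]

# Y = E[X | B]: constant on each block, equal to the block's conditional mean.
Y = np.empty_like(X)
for b in blocks:
    Y[b] = (w[b] * X[b]).sum() / w[b].sum()

def int_Q2(z, probs, a):
    """Integral over [0, a] of the tail quantile function of Z^2 for discrete Z:
    accumulate the largest values of Z^2 until probability mass a is used up
    (the tail quantile of Z^2 is its decreasing rearrangement)."""
    order = np.argsort(-z ** 2)
    total, mass = 0.0, 0.0
    for v, p in zip(z[order] ** 2, probs[order]):
        take = min(p, a - mass)
        if take <= 0.0:
            break
        total += v * take
        mass += take
    return total

# The lemma's inequality, checked on a grid of levels a in [0, 1].
for a in np.linspace(0.0, 1.0, 21):
    assert int_Q2(Y, w, a) <= int_Q2(X, w, a) + 1e-12

print("E[Y^2] =", round(int_Q2(Y, w, 1.0), 4), "<= E[X^2] =", round(int_Q2(X, w, 1.0), 4))
```

At a = 1 the check reduces to Jensen's inequality \({\mathbb E}[Y^2]\le {\mathbb E}[X^2]\); the smaller levels a probe the stronger rearrangement comparison that the lemma provides.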
© 2018 Springer Nature Switzerland AG
Douc, R., Moulines, E., Priouret, P., Soulier, P. (2018). Central Limit Theorems. In: Markov Chains. Springer Series in Operations Research and Financial Engineering. Springer, Cham. https://doi.org/10.1007/978-3-319-97704-1_21
Print ISBN: 978-3-319-97703-4
Online ISBN: 978-3-319-97704-1