In probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound. The inequality is named after the Russian mathematician Andrey Kolmogorov.

Statement of the inequality

Let \( X_1, \dots, X_n : \Omega \to \mathbf{R} \) be independent random variables defined on a common probability space \( (\Omega, F, \Pr) \), with expected value \( \operatorname{E}[X_k] = 0 \) and variance \( \operatorname{Var}[X_k] < +\infty \) for \( k = 1, \dots, n \). Then, for each \( \lambda > 0 \),

\( \Pr \left(\max_{1\leq k\leq n} | S_k |\geq\lambda\right)\leq \frac{1}{\lambda^2} \operatorname{Var} [S_n] \equiv \frac{1}{\lambda^2}\sum_{k=1}^n \operatorname{Var}[X_k], \)

where \( S_k = X_1 + \cdots + X_k \).
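
To see the bound in action, the following is a minimal Monte Carlo sketch (an illustration added here, not part of the original text), assuming Python with NumPy; the names and parameter values are made up for the example. It draws i.i.d. standard normal summands, for which \( \operatorname{Var}[S_n] = n \), and compares the empirical frequency of \( \max_k |S_k| \geq \lambda \) against the bound \( n/\lambda^2 \).

```python
# Monte Carlo check of Kolmogorov's inequality (illustrative sketch; the
# variable names and parameters are arbitrary choices, not from the article).
import numpy as np

rng = np.random.default_rng(0)
n, lam, trials = 50, 10.0, 100_000

# Rows are independent experiments; columns are the summands X_1, ..., X_n.
X = rng.standard_normal((trials, n))
S = np.cumsum(X, axis=1)                      # partial sums S_1, ..., S_n

empirical = np.mean(np.max(np.abs(S), axis=1) >= lam)
bound = n / lam**2                            # Var[S_n] / lambda^2 = n / lambda^2

print(f"Pr(max |S_k| >= {lam}) ~= {empirical:.4f}  <=  bound {bound:.4f}")
```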
Proof

The following argument is due to Kareem Amin and employs discrete martingales. As argued in the discussion of Doob's martingale inequality, the sequence \( S_1, S_2, \dots, S_n \) is a martingale. Without loss of generality, we can assume that \( S_0 = 0 \) and \( S_i \geq 0 \) for all i. Define \( (Z_i)_{i=0}^n \) as follows. Let \( Z_0 = 0 \), and

\( Z_{i+1} = \left\{ \begin{array}{ll} S_{i+1} & \text{ if } \displaystyle \max_{1 \leq j \leq i} S_j < \lambda \\ Z_i & \text{ otherwise} \end{array} \right. \)

for all i. Then \( (Z_i)_{i=0}^n \) is also a martingale. Since the increment \( S_i - S_{i-1} \) is independent of \( S_{i-1} \) and has mean zero,

\( \begin{align} \sum_{i=1}^n \text{E}[ (S_i - S_{i-1})^2] &= \sum_{i=1}^n \text{E}[ S_i^2 - 2 S_i S_{i-1} + S_{i-1}^2 ] \\ &= \sum_{i=1}^n \text{E}\left[ S_i^2 - 2 (S_{i-1} + S_{i} - S_{i-1}) S_{i-1} + S_{i-1}^2 \right] \\ &= \sum_{i=1}^n \left( \text{E}\left[ S_i^2 - S_{i-1}^2 \right] - 2\,\text{E}\left[ S_{i-1} (S_{i}-S_{i-1})\right] \right) \\ &= \sum_{i=1}^n \text{E}\left[ S_i^2 - S_{i-1}^2 \right] = \text{E}[S_n^2] - \text{E}[S_0^2] = \text{E}[S_n^2], \end{align} \)

where each cross term vanishes because \( \text{E}\left[ S_{i-1}(S_i - S_{i-1}) \right] = \text{E}[S_{i-1}]\,\text{E}[S_i - S_{i-1}] = 0 \) by independence, and the remaining sum telescopes.

The same identity holds for the martingale \( (Z_i)_{i=0}^n \). Thus

\( \begin{align} \text{Pr}\left( \max_{1 \leq i \leq n} S_i \geq \lambda\right) &= \text{Pr}[Z_n \geq \lambda] \\ &\leq \frac{1}{\lambda^2} \text{E}[Z_n^2] = \frac{1}{\lambda^2} \sum_{i=1}^n \text{E}[(Z_i - Z_{i-1})^2] \\ &\leq \frac{1}{\lambda^2} \sum_{i=1}^n \text{E}[(S_i - S_{i-1})^2] = \frac{1}{\lambda^2} \text{E}[S_n^2] = \frac{1}{\lambda^2} \text{Var}[S_n], \end{align} \)

where the first inequality is Chebyshev's inequality (the martingale property gives \( \text{E}[Z_n] = \text{E}[Z_0] = 0 \)), and the second holds because each increment \( Z_i - Z_{i-1} \) equals either \( S_i - S_{i-1} \) or zero.
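
To make the construction of \( (Z_i)_{i=0}^n \) concrete: it is simply the sequence \( (S_i) \) frozen at the first index where the partial sums reach \( \lambda \). The following short Python sketch (an added illustration under the same assumptions as above, not part of the proof; the function name is made up) implements that definition.

```python
# Sketch of the stopped sequence (Z_i) from the proof (illustrative only):
# Z tracks S until the partial sums first reach lambda, then freezes, so
# Z_n >= lambda exactly when max_i S_i >= lambda.
import numpy as np

def stopped_sequence(S, lam):
    """Return Z_1, ..., Z_n for partial sums S_1, ..., S_n, per the definition above."""
    Z = []
    frozen = None              # value of S at the first index where S_i >= lam
    for s in S:
        if frozen is not None:
            Z.append(frozen)   # Z_i = Z_{i-1} once the running maximum has reached lam
        else:
            Z.append(s)        # Z_i = S_i while max_{j < i} S_j < lam
            if s >= lam:
                frozen = s
    return np.array(Z)

rng = np.random.default_rng(1)
S = np.cumsum(rng.standard_normal(20))
Z = stopped_sequence(S, lam=1.5)
# Each Z_i - Z_{i-1} is S_i - S_{i-1} or 0, which is why E[Z_n^2] <= E[S_n^2].
print(S.round(2))
print(Z.round(2))
```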
See also

Chebyshev's inequality
Doob's martingale inequality
Etemadi's inequality
Landau–Kolmogorov inequality
Markov's inequality
Bernstein inequalities (probability theory)

References

Billingsley, Patrick (1995). Probability and Measure. New York: John Wiley & Sons. ISBN 0-471-00710-2. (Theorem 22.4)
Feller, William (1968) [1950]. An Introduction to Probability Theory and its Applications, Vol. 1 (3rd ed.). New York: John Wiley & Sons. xviii+509 pp. ISBN 0-471-25708-7.

This article incorporates material from Kolmogorov's inequality on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.
