Kolmogorov's two-series theorem

In probability theory, Kolmogorov's two-series theorem is a result about the convergence of random series. It follows from Kolmogorov's inequality and is used in one proof of the strong law of large numbers.

Statement of the theorem

Let $\left(X_n\right)_{n=1}^{\infty}$ be independent random variables with expected values $\mathbf{E}[X_n] = \mu_n$ and variances $\operatorname{Var}(X_n) = \sigma_n^2$, such that $\sum_{n=1}^{\infty} \mu_n$ converges in ℝ and $\sum_{n=1}^{\infty} \sigma_n^2$ converges in ℝ. Then $\sum_{n=1}^{\infty} X_n$ converges in ℝ almost surely.
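
As a concrete illustration (an example added here, not part of the original statement), take the harmonic series with independent random signs: let $X_n = \varepsilon_n / n$, where the $\varepsilon_n$ are independent with $\mathbf{P}(\varepsilon_n = +1) = \mathbf{P}(\varepsilon_n = -1) = 1/2$. Then $\mu_n = 0$ and $\sigma_n^2 = 1/n^2$, so $\sum_{n=1}^{\infty}\mu_n = 0$ and $\sum_{n=1}^{\infty}\sigma_n^2 = \pi^2/6$ both converge, and the theorem gives that the random harmonic series $\sum_{n=1}^{\infty}\varepsilon_n/n$ converges almost surely.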

Proof

Assume WLOG that $\mu_n = 0$ for all $n$ (otherwise replace $X_n$ by $X_n - \mu_n$ and use that $\sum_{n=1}^{\infty}\mu_n$ converges). Set $S_N = \sum_{n=1}^{N} X_n$, and we will see that $\limsup_{N\to\infty} S_N - \liminf_{N\to\infty} S_N = 0$ with probability 1.

For every $M \in \mathbb{N}$,

$$\limsup_{N\to\infty} S_N - \liminf_{N\to\infty} S_N = \limsup_{N\to\infty}\left(S_N - S_M\right) - \liminf_{N\to\infty}\left(S_N - S_M\right) \leq 2\max_{k \in \mathbb{N}} \left|\sum_{i=1}^{k} X_{M+i}\right|.$$

Thus, for every $M \in \mathbb{N}$ and $\epsilon > 0$,

$$\begin{aligned}
\mathbf{P}\left(\limsup_{N\to\infty} S_N - \liminf_{N\to\infty} S_N \geq \epsilon\right)
&\leq \mathbf{P}\left(2\max_{k\in\mathbb{N}}\left|\sum_{i=1}^{k} X_{M+i}\right| \geq \epsilon\right) \\
&= \lim_{N\to\infty}\mathbf{P}\left(2\max_{1\leq k\leq N}\left|\sum_{i=1}^{k} X_{M+i}\right| \geq \epsilon\right) \\
&\leq \lim_{N\to\infty}\frac{4}{\epsilon^{2}}\sum_{i=M+1}^{M+N}\sigma_{i}^{2}
= \frac{4}{\epsilon^{2}}\sum_{i=M+1}^{\infty}\sigma_{i}^{2},
\end{aligned}$$

where the equality holds by continuity from below of the probability measure, and the second inequality is due to Kolmogorov's inequality.
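
For reference (the precise formulation is recalled here rather than quoted from this article): Kolmogorov's inequality states that if $Y_1, \dots, Y_N$ are independent random variables with $\mathbf{E}[Y_i] = 0$ and finite variances, and $T_k = Y_1 + \cdots + Y_k$, then for every $\lambda > 0$,

$$\mathbf{P}\left(\max_{1\leq k\leq N}\left|T_k\right| \geq \lambda\right) \leq \frac{1}{\lambda^{2}}\sum_{i=1}^{N}\operatorname{Var}(Y_i).$$

Applying it with $Y_i = X_{M+i}$ and $\lambda = \epsilon/2$ produces the factor $4/\epsilon^{2}$ above.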

By the assumption that $\sum_{n=1}^{\infty}\sigma_n^{2}$ converges, the last term tends to 0 as $M \to \infty$, for every $\epsilon > 0$. Hence $\mathbf{P}\left(\limsup_{N\to\infty} S_N - \liminf_{N\to\infty} S_N \geq \epsilon\right) = 0$ for every $\epsilon > 0$, so with probability 1 the limit superior and limit inferior of $S_N$ coincide, i.e. $\sum_{n=1}^{\infty} X_n$ converges almost surely. ∎
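
The following short numerical sketch (not part of the original article; it reuses the random-sign harmonic series example above) illustrates the almost sure convergence: along each independently sampled path, the partial sums of $\sum_{n}\varepsilon_n/n$ settle down to a (random) finite limit.

    import numpy as np

    # Minimal numerical sketch (illustrative only): partial sums of the random
    # signed harmonic series X_n = eps_n / n, which satisfies the hypotheses of
    # the two-series theorem (mu_n = 0, sum of variances = pi^2/6 < infinity).
    rng = np.random.default_rng(seed=0)

    n_terms = 100_000
    for path in range(3):
        signs = rng.choice([-1.0, 1.0], size=n_terms)   # eps_n, i.i.d. +-1
        terms = signs / np.arange(1, n_terms + 1)       # X_n = eps_n / n
        partial_sums = np.cumsum(terms)                 # S_N for N = 1..n_terms
        # Along each path the late partial sums barely move, consistent with
        # almost sure convergence of the series.
        print(f"path {path}: S_1000 = {partial_sums[999]:+.5f}, "
              f"S_100000 = {partial_sums[-1]:+.5f}")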
