Cramér's theorem (large deviations)

Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It gives the rate function governing the tail probabilities of sums of a sequence of iid random variables. A weak version of this result was first proved by Harald Cramér in 1938.

Statement

The logarithmic moment generating function (which is the cumulant-generating function) of a random variable is defined as

Λ(t) = log E[exp(t X₁)].

Let X₁, X₂, … be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. Λ(t) < ∞ for all t ∈ ℝ.

Then the Legendre transform of Λ,

Λ*(x) := sup_{t ∈ ℝ} (xt − Λ(t)),

satisfies

lim_{n → ∞} (1/n) log P(X₁ + ⋯ + Xₙ ≥ nx) = −Λ*(x)

for all x > E[X₁].
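As an illustrative sketch (not part of the original statement), the Legendre transform can be evaluated numerically. For a standard normal variable, Λ(t) = t²/2, and the supremum of xt − Λ(t) is attained at t = x, giving Λ*(x) = x²/2; a simple grid search recovers this closed form:

```python
import math

def log_mgf_gaussian(t):
    """Λ(t) = log E[exp(t X)] for a standard normal X; equals t²/2."""
    return 0.5 * t * t

def legendre(x, log_mgf, t_lo=-10.0, t_hi=10.0, steps=20001):
    """Numerically evaluate Λ*(x) = sup_t (x·t − Λ(t)) over a grid of t."""
    best = -math.inf
    for i in range(steps):
        t = t_lo + (t_hi - t_lo) * i / (steps - 1)
        best = max(best, x * t - log_mgf(t))
    return best

# For the standard normal the closed form is Λ*(x) = x²/2.
for x in (0.5, 1.0, 2.0):
    print(x, legendre(x, log_mgf_gaussian), x * x / 2)
```

The grid bounds and step count are arbitrary choices; for log-MGFs that are finite only on part of the real line, the search range would have to be restricted accordingly.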

In the terminology of the theory of large deviations, the result can be reformulated as follows:

If (Xᵢ)ᵢ∈ℕ is a sequence of iid real random variables, then the distributions of the empirical means (1/n)(X₁ + ⋯ + Xₙ) satisfy a large deviation principle with rate function Λ*.
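The convergence asserted by the theorem can be checked numerically in a case where the tail probability is computable exactly. The following sketch (an illustration, not from the article) takes fair Bernoulli trials, for which Λ*(x) is the relative entropy x log(x/p) + (1 − x) log((1 − x)/(1 − p)), and compares −(1/n) log P(Sₙ ≥ nx) against Λ*(x) as n grows:

```python
import math

def rate_bernoulli(x, p=0.5):
    """Λ*(x) for Bernoulli(p): relative entropy of Bernoulli(x) w.r.t. Bernoulli(p)."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

def tail_decay(n, x, p=0.5):
    """-(1/n) log P(S_n >= n·x) for S_n ~ Binomial(n, p), computed exactly.

    Works in log space (via lgamma) to avoid overflow/underflow for large n.
    """
    k0 = math.ceil(n * x)
    logs = [math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p)
            for k in range(k0, n + 1)]
    m = max(logs)
    log_prob = m + math.log(sum(math.exp(L - m) for L in logs))
    return -log_prob / n

x = 0.6
print(rate_bernoulli(x))        # ≈ 0.0201
for n in (100, 400, 1600):
    print(n, tail_decay(n, x))  # decreases toward Λ*(0.6) as n grows
```

The exact decay rates approach Λ*(0.6) from above, reflecting the subexponential prefactor in P(Sₙ ≥ nx) that the limit washes out.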

