Markov information source

In mathematics, a Markov information source, or simply a Markov source, is an information source whose underlying dynamics are given by a stationary finite Markov chain.

Formal definition

An information source is a sequence of random variables taking values in a finite alphabet Γ and having a stationary distribution.

A Markov information source is then a (stationary) Markov chain M, together with a function

f : S → Γ

that maps states S of the Markov chain to letters in the alphabet Γ.
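
To make the definition concrete, here is a minimal sketch in Python of such a source: a two-state stationary chain with hypothetical transition probabilities, whose states are mapped to the alphabet Γ = {a, b} by a labeling function f. The state names and probabilities are illustrative, not part of any standard construction.

```python
import random

# Hypothetical two-state Markov chain over states S = {s0, s1}.
states = ["s0", "s1"]
transitions = {  # P(next state | current state)
    "s0": {"s0": 0.9, "s1": 0.1},
    "s1": {"s0": 0.5, "s1": 0.5},
}
f = {"s0": "a", "s1": "b"}  # labeling function f : S -> Gamma

def emit(start="s0", n=10):
    """Walk the chain for n steps, emitting the letter f(state) at each step."""
    state, out = start, []
    for _ in range(n):
        out.append(f[state])
        nxt = transitions[state]
        state = random.choices(list(nxt), weights=list(nxt.values()))[0]
    return "".join(out)

print(emit())  # e.g. "aaaaabaaaa"
```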

A unifilar Markov source is a Markov source for which the values f(s) are distinct whenever each of the states s is reachable, in one step, from a common prior state. Unifilar sources are notable in that many of their properties are far more easily analyzed than those of the general case.
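
As an illustration of this condition, the following sketch checks unifilarity for the toy source above (reusing transitions and f from the previous sketch): from every state, the states reachable in one step must carry distinct letters.

```python
def is_unifilar(transitions, f):
    """True if, from every state, the one-step successors map to distinct letters."""
    for state, nxt in transitions.items():
        reachable = [s for s, p in nxt.items() if p > 0]
        letters = [f[s] for s in reachable]
        if len(letters) != len(set(letters)):
            return False
    return True

# From s0 one can reach s0 (letter "a") and s1 (letter "b"), and likewise
# from s1, so the toy source above is unifilar.
print(is_unifilar(transitions, f))  # True
```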

Applications

Markov sources are commonly used in communication theory as a model of a transmitter. They also occur in natural language processing, where they are used to represent hidden meaning in a text. Given the output of a Markov source whose underlying Markov chain is unknown, the task of inferring the underlying chain is addressed by the techniques of hidden Markov models, such as the Viterbi algorithm.
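
As a sketch of that decoding step, the following is a minimal Viterbi implementation in Python. It assumes the hidden chain's start, transition, and emission probabilities are already known (the two-state model below is hypothetical), and returns the most likely state sequence behind an observed output string.

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for the observation sequence obs."""
    # V[t][s]: log-probability of the best path ending in state s at time t.
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    # Trace back from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Hypothetical two-state source emitting letters "a" and "b" noisily.
states = ("s0", "s1")
start_p = {"s0": 0.6, "s1": 0.4}
trans_p = {"s0": {"s0": 0.7, "s1": 0.3}, "s1": {"s0": 0.4, "s1": 0.6}}
emit_p = {"s0": {"a": 0.9, "b": 0.1}, "s1": {"a": 0.2, "b": 0.8}}
print(viterbi("aabba", states, start_p, trans_p, emit_p))
```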
