Expectation propagation

Expectation propagation (EP) is a technique in Bayesian machine learning.[1]

EP finds tractable approximations to a probability distribution.[1] It uses an iterative approach that exploits the factorization structure of the target distribution.[1] It differs from other Bayesian approximation approaches, such as variational Bayesian methods, in the direction of the Kullback–Leibler divergence that it minimizes.[1]

More specifically, suppose we wish to approximate an intractable probability distribution p(x) with a tractable distribution q(x). Expectation propagation achieves this approximation by minimizing the Kullback–Leibler divergence KL(p || q).[1] Variational Bayesian methods minimize KL(q || p) instead.[1]
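
To make the direction of the divergence concrete, the following sketch (a hypothetical illustration assuming NumPy and SciPy; none of the names come from a specific EP implementation) fits a single Gaussian to a bimodal target under each objective. Minimizing KL(p || q) yields a broad, mass-covering Gaussian, while minimizing KL(q || p) concentrates on one mode.

  # Illustrative comparison of the two KL directions on a 1-D bimodal target.
  # Assumes NumPy and SciPy; all variable names are hypothetical.
  import numpy as np
  from scipy.stats import norm

  x = np.linspace(-10, 10, 2001)
  dx = x[1] - x[0]

  # Bimodal target p(x): an equal mixture of two well-separated Gaussians.
  p = 0.5 * norm.pdf(x, -3, 1) + 0.5 * norm.pdf(x, 3, 1)

  def kl(a, b):
      # Discretized KL(a || b) on the grid (0 * log 0 treated as 0).
      mask = a > 0
      return np.sum(a[mask] * np.log(a[mask] / b[mask])) * dx

  # Grid search over Gaussian candidates q(x) = N(x; mu, sigma^2).
  best_pq = best_qp = (np.inf, None, None)
  for mu in np.linspace(-5, 5, 101):
      for sigma in np.linspace(0.5, 5.0, 46):
          q = norm.pdf(x, mu, sigma)
          best_pq = min(best_pq, (kl(p, q), mu, sigma))  # EP's direction
          best_qp = min(best_qp, (kl(q, p), mu, sigma))  # variational Bayes' direction

  print("argmin KL(p||q):", best_pq[1:])  # mu ~ 0, large sigma: covers both modes
  print("argmin KL(q||p):", best_qp[1:])  # q concentrated on a single mode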

If q(x) is a Gaussian N(x | μ, Σ), then KL(p || q) is minimized by setting μ and Σ equal to the mean of p(x) and the covariance of p(x), respectively; this is called moment matching.[1]
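
As a minimal sketch of moment matching (assuming NumPy; the target below is an arbitrary non-Gaussian example chosen for illustration, not taken from the reference), the best Gaussian in the KL(p || q) sense simply copies the mean and covariance of p, here estimated from samples:

  # Moment matching from samples of a non-Gaussian 2-D target p(x).
  # Assumes NumPy; the target is an arbitrary illustrative choice.
  import numpy as np

  rng = np.random.default_rng(0)
  z = rng.normal(size=(100_000, 2))
  samples = np.column_stack([z[:, 0], z[:, 1] ** 2])  # second coordinate is skewed

  # The moment-matched Gaussian q = N(mu, Sigma) uses the mean and covariance of p.
  mu = samples.mean(axis=0)
  Sigma = np.cov(samples, rowvar=False)

  print("mu    ~", mu)       # approximately [0, 1]
  print("Sigma ~\n", Sigma)  # approximately diag(1, 2)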

Applications

Expectation propagation via moment matching plays a vital role in approximating the indicator-function factors that appear when deriving the message-passing equations for TrueSkill, as sketched below.
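
As a hedged illustration (assuming SciPy; the function below is a standalone sketch of the standard truncated-Gaussian moment formulas, not TrueSkill's actual API), moment matching replaces the product of a Gaussian message and an indicator 1[x > 0] with the Gaussian that has the truncated distribution's first two moments:

  # Moment-match a Gaussian to N(x; mu, sigma^2) * 1[x > 0], i.e. a truncated
  # Gaussian. Assumes SciPy; the function name is illustrative.
  from scipy.stats import norm

  def match_truncated_gaussian(mu, sigma):
      alpha = mu / sigma
      lam = norm.pdf(alpha) / norm.cdf(alpha)        # phi(alpha) / Phi(alpha)
      mean = mu + sigma * lam                        # mean of the truncated Gaussian
      var = sigma**2 * (1.0 - lam * (lam + alpha))   # its variance
      return mean, var**0.5

  # Half-Gaussian check: mean ~ 0.798, std ~ 0.603.
  print(match_truncated_gaussian(0.0, 1.0))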

References

  1. Bishop, Christopher (2007). Pattern Recognition and Machine Learning. New York: Springer-Verlag New York Inc. ISBN 978-0387310732.

