Expectation propagation

Expectation propagation (EP) is a technique in Bayesian machine learning.[1]

EP finds approximations to a probability distribution by an iterative scheme that exploits the factorization structure of the target distribution.[1] In this it differs from other Bayesian approximation approaches such as variational Bayesian methods.[1]
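
In outline, the target is written as a product of factors, and EP cycles through the factors, refining one approximate factor at a time while holding the others fixed. The following schematic of a single update step is a reconstruction in notation introduced here for illustration (it does not appear in the source):

    % One EP update step (schematic; notation introduced for illustration).
    % Target: p(\theta) \propto \prod_i f_i(\theta);
    % approximation: q(\theta) \propto \prod_i \tilde{f}_i(\theta).
    \begin{align*}
      q^{\setminus i}(\theta) &\propto q(\theta) / \tilde{f}_i(\theta)
        && \text{remove factor } i \text{ (the ``cavity'' distribution)} \\
      \hat{p}(\theta) &\propto f_i(\theta)\, q^{\setminus i}(\theta)
        && \text{reinsert the true factor} \\
      q^{\mathrm{new}} &= \arg\min_{q'} \operatorname{KL}\!\left(\hat{p} \,\middle\|\, q'\right)
        && \text{project onto the tractable family} \\
      \tilde{f}_i(\theta) &\propto q^{\mathrm{new}}(\theta) / q^{\setminus i}(\theta)
        && \text{update the approximate factor}
    \end{align*}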

More specifically, suppose we wish to approximate an intractable probability distribution $p(\mathbf{x})$ with a tractable distribution $q(\mathbf{x})$. Expectation propagation achieves this approximation by minimizing the Kullback–Leibler divergence $\operatorname{KL}(p\|q)$.[1] Variational Bayesian methods minimize $\operatorname{KL}(q\|p)$ instead.[1]
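
For reference, the divergence being minimized is the standard Kullback–Leibler divergence (its definition is reproduced here; it is not stated in the source):

    \operatorname{KL}(p \,\|\, q) = \int p(\mathbf{x}) \ln \frac{p(\mathbf{x})}{q(\mathbf{x})} \, d\mathbf{x}

Because the divergence is asymmetric, the two choices of direction yield qualitatively different approximations: minimizing $\operatorname{KL}(p\|q)$ pushes $q$ to cover all regions where $p$ has appreciable mass, whereas minimizing $\operatorname{KL}(q\|p)$ tends to lock $q$ onto a single mode of $p$.[1]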

If $q(\mathbf{x})$ is a Gaussian $\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma})$, then $\operatorname{KL}(p\|q)$ is minimized by taking $\boldsymbol{\mu}$ and $\boldsymbol{\Sigma}$ to be the mean of $p(\mathbf{x})$ and the covariance of $p(\mathbf{x})$, respectively; this is called moment matching.[1]
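
As a concrete illustration, here is a minimal sketch of moment matching in one dimension (not from the article; the Gaussian-mixture stand-in for $p$ and all names are assumptions). The $\operatorname{KL}(p\|q)$-optimal Gaussian is obtained simply by estimating the mean and variance of $p$:

    # A minimal moment-matching sketch (illustrative assumptions throughout).
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for an intractable p(x): a two-component Gaussian mixture,
    # represented here by samples.
    n = 100_000
    left = rng.random(n) < 0.3
    samples = np.where(left,
                       rng.normal(-2.0, 0.5, n),   # component 1
                       rng.normal(1.5, 1.0, n))    # component 2

    # Moment matching: q = N(mu, sigma^2) with p's mean and variance.
    mu = samples.mean()
    sigma2 = samples.var()
    print(f"moment-matched q(x) = N({mu:.3f}, {sigma2:.3f})")

In EP proper, the moments are computed analytically against one intractable factor at a time rather than from global samples; the sampling here only makes the projection step visible.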

Applications

Expectation propagation via moment matching plays a vital role in approximating the indicator functions that appear when deriving the message-passing equations for TrueSkill.
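
As an illustrative sketch of that approximation in the simplest case (draw margin omitted; the function name and example values are assumptions, not TrueSkill's actual API): multiplying a Gaussian belief $\mathcal{N}(t \mid \mu, \sigma^2)$ over a performance difference $t$ by the indicator $\mathbb{I}(t > 0)$ of an observed win gives a truncated Gaussian, and the moment-matched Gaussian message follows from the standard truncated-Gaussian moment formulas:

    # Moment matching for an indicator factor I(t > 0) times N(t | mu, sigma^2)
    # (a hedged sketch; names and example values are assumptions).
    from scipy.stats import norm

    def truncated_gaussian_moments(mu, sigma):
        """Mean and variance of N(mu, sigma^2) restricted to t > 0."""
        alpha = mu / sigma
        v = norm.pdf(alpha) / norm.cdf(alpha)  # additive mean correction
        w = v * (v + alpha)                    # variance shrinkage factor
        mean = mu + sigma * v
        var = sigma**2 * (1.0 - w)
        return mean, var

    # Example: belief N(0.5, 1.0) about a skill difference, observed win (t > 0).
    print(truncated_gaussian_moments(0.5, 1.0))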


References

  1. Bishop, Christopher (2007). Pattern Recognition and Machine Learning. New York: Springer-Verlag New York Inc. ISBN 978-0387310732.

