Evidence lower bound

In statistics, the evidence lower bound (ELBO, also called the variational lower bound or negative variational free energy) is the quantity optimized in variational Bayesian methods. These methods handle cases where a distribution Q(Z) over unobserved variables Z is optimized as an approximation to the true posterior P(Z | X), given observed data X. The evidence lower bound is then defined as [1]:

    L(Q) := \mathbb{E}_{Z \sim Q}[\log P(X, Z)] - \mathbb{E}_{Z \sim Q}[\log Q(Z)] = H(Q) - H(Q; P(X, Z)),

where H(Q) is the entropy of Q and H(Q; P(X, Z)) is the cross entropy of Q relative to the joint distribution P(X, Z). Maximizing the evidence lower bound minimizes D_KL(Q(Z) ‖ P(Z | X)), the Kullback–Leibler divergence, a measure of the dissimilarity of Q from the true posterior. The primary reason this quantity is preferred for optimization is that it can be computed without access to the posterior, given a good choice of Q.
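
As an illustration (not part of the original article), the following minimal Python sketch estimates the ELBO by Monte Carlo for a hypothetical toy model in which the prior is Z ~ N(0, 1), the likelihood is X | Z ~ N(Z, 1), and Q is a Gaussian N(mu, sigma^2); the model, the function name elbo, and its parameters are assumptions made only for this example.

    # Monte Carlo estimate of ELBO(Q) = E_{Z~Q}[log P(x, Z)] - E_{Z~Q}[log Q(Z)]
    # for an assumed toy model: prior Z ~ N(0, 1), likelihood X | Z ~ N(Z, 1).
    import numpy as np
    from scipy.stats import norm

    def elbo(x_obs, mu, sigma, n_samples=100_000, seed=0):
        """Estimate the ELBO for Q = N(mu, sigma^2) using samples drawn from Q."""
        rng = np.random.default_rng(seed)
        z = rng.normal(mu, sigma, size=n_samples)                      # Z ~ Q
        log_joint = norm(0, 1).logpdf(z) + norm(z, 1).logpdf(x_obs)    # log P(Z) + log P(x | Z)
        log_q = norm(mu, sigma).logpdf(z)                              # log Q(Z)
        return np.mean(log_joint - log_q)                              # H(Q) - H(Q; P(x, .))

    x_obs = 1.7
    print(elbo(x_obs, mu=0.0, sigma=1.0))                  # a poor choice of Q gives a smaller ELBO
    print(elbo(x_obs, mu=x_obs / 2, sigma=np.sqrt(0.5)))   # the exact posterior N(x/2, 1/2) maximizes it

Note that the estimate only requires samples from Q and evaluations of the joint density P(X, Z), not the posterior itself.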

For other measures of dissimilarity that can be optimized to fit Q, see Divergence (statistics) [2].

Justification as a lower bound on the evidence

The name evidence lower bound is justified by analyzing a decomposition of the KL-divergence between the true posterior and Q [3]:

    \log P(X) = \mathbb{E}_{Z \sim Q}[\log P(X, Z)] - \mathbb{E}_{Z \sim Q}[\log Q(Z)] + D_{KL}(Q(Z) \| P(Z | X)) = L(Q) + D_{KL}(Q(Z) \| P(Z | X)).

As D_KL(Q(Z) ‖ P(Z | X)) ≥ 0, this equation shows that the evidence lower bound is indeed a lower bound on the log-evidence log P(X) for the model considered. As log P(X) does not depend on Q, the equation additionally shows that maximizing the evidence lower bound on the right minimizes D_KL(Q(Z) ‖ P(Z | X)), as claimed above.
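
To make the decomposition concrete, here is a short numerical check (continuing the assumed conjugate-Gaussian toy model from the sketch above, which is not from the original article): the gap between the log-evidence and a Monte Carlo estimate of the ELBO equals the closed-form KL divergence from Q to the exact posterior.

    # Numerical check of log P(x) = ELBO(Q) + D_KL(Q || P(. | x)) for the assumed
    # toy model: prior Z ~ N(0, 1), likelihood X | Z ~ N(Z, 1), Q = N(mu, sigma^2).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    x_obs = 1.7
    log_evidence = norm(0, np.sqrt(2)).logpdf(x_obs)       # marginally X ~ N(0, 2)
    post_mu, post_var = x_obs / 2, 0.5                     # exact posterior P(Z | x) = N(x/2, 1/2)

    mu, sigma = 0.5, 0.9                                   # an arbitrary (suboptimal) Q
    z = rng.normal(mu, sigma, size=200_000)                # Z ~ Q
    elbo = np.mean(norm(0, 1).logpdf(z) + norm(z, 1).logpdf(x_obs) - norm(mu, sigma).logpdf(z))

    # Closed-form KL divergence between the univariate Gaussians Q and P(. | x).
    kl = 0.5 * (sigma**2 / post_var + (mu - post_mu)**2 / post_var - 1 + np.log(post_var / sigma**2))

    print(f"ELBO         = {elbo:.4f}")          # strictly below the log-evidence
    print(f"ELBO + KL    = {elbo + kl:.4f}")     # recovers it, up to Monte Carlo error
    print(f"log-evidence = {log_evidence:.4f}")

Because log P(X) is fixed by the data, raising the ELBO over the variational parameters necessarily shrinks the KL term, which is the sense in which maximizing the ELBO drives Q toward the posterior.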


References

  1. Yang, Xitong. "Understanding the Variational Lower Bound" (PDF). Institute for Advanced Computer Studies, University of Maryland. Retrieved 20 March 2018.
  2. Minka, Thomas (2005). Divergence Measures and Message Passing (PDF).
  3. Bishop, Christopher M. (2006). "10.1 Variational Inference". Pattern Recognition and Machine Learning (PDF).