Jensen–Shannon divergence

In probability theory and statistics, the Jensen–Shannon divergence is a method of measuring the similarity between two probability distributions. It is also known as information radius (IRad)[1] or total divergence to the average.[2] It is based on the Kullback–Leibler divergence, with some notable (and useful) differences, including that it is symmetric and it always has a finite value. The square root of the Jensen–Shannon divergence is a metric often referred to as the Jensen–Shannon distance.[3][4][5]

Definition

Consider the set $M_+^1(A)$ of probability distributions, where $A$ is a set provided with some σ-algebra of measurable subsets. In particular we can take $A$ to be a finite or countable set with all subsets being measurable.

The Jensen–Shannon divergence (JSD) is a symmetrized and smoothed version of the Kullback–Leibler divergence $D(P \parallel Q)$. It is defined by

$${\rm JSD}(P \parallel Q) = \frac{1}{2} D(P \parallel M) + \frac{1}{2} D(Q \parallel M),$$

where $M = \frac{1}{2}(P + Q)$ is the mixture distribution of $P$ and $Q$.

A generalization of the Jensen–Shannon divergence using abstract means (like the geometric or harmonic means) instead of the arithmetic mean was recently proposed.[6] The geometric Jensen–Shannon divergence (or G-Jensen–Shannon divergence) yields a closed-form formula for divergence between two Gaussian distributions by taking the geometric mean.
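To see why the geometric mean yields a closed form for Gaussians: a weighted geometric mean of two Gaussian densities is again proportional to a Gaussian, so each Kullback–Leibler term can be evaluated analytically. The sketch below is a minimal illustration for univariate Gaussians, under the assumption that the G-Jensen–Shannon divergence averages the two KL terms against the normalized geometric-mean density; the helper names and the weighting convention are ours and may differ from the exact parameterization in [6].

```python
import numpy as np

def kl_gauss(mu_a, var_a, mu_b, var_b):
    """KL(N(mu_a, var_a) || N(mu_b, var_b)) in nats (closed form for univariate Gaussians)."""
    return 0.5 * (np.log(var_b / var_a) + (var_a + (mu_a - mu_b) ** 2) / var_b - 1.0)

def geometric_jsd_gauss(mu1, var1, mu2, var2, alpha=0.5):
    """Sketch of a geometric Jensen-Shannon divergence between two univariate Gaussians.

    The weighted geometric mean p^(1-alpha) q^alpha of two Gaussian densities is
    proportional to another Gaussian whose precision is the weighted average of the
    two precisions, which is what makes a closed-form evaluation possible.
    """
    prec_g = (1 - alpha) / var1 + alpha / var2
    var_g = 1.0 / prec_g
    mu_g = var_g * ((1 - alpha) * mu1 / var1 + alpha * mu2 / var2)
    return (1 - alpha) * kl_gauss(mu1, var1, mu_g, var_g) + alpha * kl_gauss(mu2, var2, mu_g, var_g)

print(geometric_jsd_gauss(0.0, 1.0, 2.0, 0.5))  # finite, closed-form value in nats
```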

A more general definition, allowing for the comparison of more than two probability distributions, is:

$${\rm JSD}_{\pi_1, \ldots, \pi_n}(P_1, \ldots, P_n) = H\!\left(\sum_{i=1}^n \pi_i P_i\right) - \sum_{i=1}^n \pi_i H(P_i),$$

where $\pi_1, \ldots, \pi_n$ are weights that are selected for the probability distributions $P_1, \ldots, P_n$ and $H(P)$ is the Shannon entropy for distribution $P$. For the two-distribution case described above,

$${\rm JSD}(P \parallel Q) = H(M) - \frac{1}{2}\bigl(H(P) + H(Q)\bigr), \qquad M = \frac{1}{2}(P + Q).$$
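To make the definition concrete, the following sketch (using NumPy and SciPy; the helper name `jsd` is ours) computes the divergence between two discrete distributions both from the KL-based definition and from the entropy-based form above, and checks that the two forms agree. Note that SciPy's `scipy.spatial.distance.jensenshannon` returns the square root of this quantity, i.e. the Jensen–Shannon distance.

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) gives KL(p || q); entropy(p) gives H(p)

def jsd(p, q, base=2):
    """Jensen-Shannon divergence between two discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    # KL-based form: average of the two divergences to the midpoint m.
    kl_form = 0.5 * entropy(p, m, base=base) + 0.5 * entropy(q, m, base=base)
    # Entropy-based form: H(m) minus the average entropy of p and q.
    entropy_form = entropy(m, base=base) - 0.5 * (entropy(p, base=base) + entropy(q, base=base))
    assert np.isclose(kl_form, entropy_form)
    return kl_form

p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]
print(jsd(p, q))           # divergence in bits, lies in [0, 1]
print(np.sqrt(jsd(p, q)))  # Jensen-Shannon distance (a metric)
```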

Bounds

The Jensen–Shannon divergence is bounded by 1 for two probability distributions, given that one uses the base 2 logarithm:[7]

$$0 \le {\rm JSD}(P \parallel Q) \le 1.$$

With this normalization, it is a lower bound on the total variation distance between P and Q:

$${\rm JSD}(P \parallel Q) \le \delta(P, Q), \qquad \delta(P, Q) = \tfrac{1}{2} \lVert P - Q \rVert_1.$$

For log base e, or ln, which is commonly used in statistical thermodynamics, the upper bound is ln(2):

$$0 \le {\rm JSD}(P \parallel Q) \le \ln 2.$$

A more general bound holds for more than two probability distributions: the Jensen–Shannon divergence of $n$ distributions is bounded by $\log_2 n$, given that one uses the base 2 logarithm:[7]

$$0 \le {\rm JSD}_{\pi_1, \ldots, \pi_n}(P_1, \ldots, P_n) \le \log_2 n.$$
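These bounds are easy to probe numerically. The sketch below samples random pairs of distributions and checks the base-2 bound, the ln 2 bound in nats, and the total-variation bound stated above; it relies on `scipy.spatial.distance.jensenshannon`, which returns the Jensen–Shannon distance (the square root of the divergence), so the result is squared to recover the divergence.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(0)
for _ in range(1000):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    jsd_bits = jensenshannon(p, q, base=2) ** 2   # divergence in bits
    jsd_nats = jensenshannon(p, q) ** 2           # divergence in nats (default base e)
    tv = 0.5 * np.abs(p - q).sum()                # total variation distance
    assert 0.0 <= jsd_bits <= 1.0
    assert jsd_nats <= np.log(2) + 1e-12
    assert jsd_bits <= tv + 1e-12
print("all bounds hold on the sampled pairs")
```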

Relation to mutual information

The Jensen–Shannon divergence is the mutual information between a random variable $X$ associated to a mixture distribution between $P$ and $Q$ and the binary indicator variable $Z$ that is used to switch between $P$ and $Q$ to produce the mixture. Let $X$ be some abstract function on the underlying set of events that discriminates well between events, and choose the value of $X$ according to $P$ if $Z = 0$ and according to $Q$ if $Z = 1$, where $Z$ is equiprobable. That is, we are choosing $X$ according to the probability measure $M = \frac{1}{2}(P + Q)$, and its distribution is the mixture distribution. We compute

$$I(X; Z) = H(X) - H(X \mid Z) = H(M) - \tfrac{1}{2}\bigl(H(P) + H(Q)\bigr) = {\rm JSD}(P \parallel Q).$$

It follows from the above result that the Jensen–Shannon divergence is bounded by 0 and 1, because mutual information is non-negative and bounded by $H(Z) = 1$ bit. The JSD is not always bounded by 0 and 1: the upper limit of 1 arises here because we are considering the specific case involving the binary variable $Z$.
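The identity can be checked directly for finite distributions: with $Z$ equiprobable, $H(X) = H(M)$ and $H(X \mid Z)$ is the average of the two entropies, so the mutual information reduces to the entropy form of the JSD. A minimal sketch, reusing the (arbitrarily chosen) example distributions from the earlier sketch:

```python
import numpy as np
from scipy.stats import entropy

# P and Q over the same finite event set; Z ~ Bernoulli(1/2) selects which one X is drawn from.
p = np.array([0.10, 0.40, 0.50])
q = np.array([0.80, 0.15, 0.05])
m = 0.5 * (p + q)  # marginal distribution of X

# Mutual information I(X; Z) = H(X) - H(X | Z)
h_x = entropy(m, base=2)
h_x_given_z = 0.5 * entropy(p, base=2) + 0.5 * entropy(q, base=2)
mutual_info = h_x - h_x_given_z

jsd = 0.5 * entropy(p, m, base=2) + 0.5 * entropy(q, m, base=2)
print(mutual_info, jsd)  # the two quantities coincide
```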

One can apply the same principle to a joint distribution and the product of its two marginal distributions (in analogy to the Kullback–Leibler divergence and mutual information) to measure how reliably one can decide whether a given response comes from the joint distribution or the product distribution, subject to the assumption that these are the only two possibilities.[8]

Quantum Jensen–Shannon divergence

The generalization of probability distributions to density matrices makes it possible to define the quantum Jensen–Shannon divergence (QJSD).[9][10] It is defined for a set of density matrices $(\rho_1, \ldots, \rho_n)$ and a probability distribution $\pi = (\pi_1, \ldots, \pi_n)$ as

$${\rm QJSD}(\rho_1, \ldots, \rho_n) = S\!\left(\sum_{i=1}^n \pi_i \rho_i\right) - \sum_{i=1}^n \pi_i S(\rho_i),$$

where $S(\rho)$ is the von Neumann entropy of $\rho$. This quantity was introduced in quantum information theory, where it is called the Holevo information: it gives the upper bound for the amount of classical information encoded by the quantum states $(\rho_1, \ldots, \rho_n)$ under the prior distribution $\pi$ (see Holevo's theorem).[11] The quantum Jensen–Shannon divergence for $\pi = \left(\tfrac{1}{2}, \tfrac{1}{2}\right)$ and two density matrices is a symmetric function, everywhere defined, bounded, and equal to zero only if the two density matrices are the same. It is the square of a metric for pure states,[12] and it was recently shown that this metric property holds for mixed states as well.[13][14] The Bures metric is closely related to the quantum JS divergence; it is the quantum analog of the Fisher information metric.
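For finite-dimensional density matrices the definition can be evaluated directly from eigenvalues. A minimal sketch for the two-state, equal-weight case (the helper names are ours):

```python
import numpy as np

def von_neumann_entropy(rho, base=2):
    """Von Neumann entropy S(rho) = -Tr(rho log rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 log 0 is taken to be 0
    return -np.sum(evals * np.log(evals)) / np.log(base)

def qjsd(rho, sigma, base=2):
    """Quantum Jensen-Shannon divergence for pi = (1/2, 1/2)."""
    mix = 0.5 * (rho + sigma)
    return von_neumann_entropy(mix, base) - 0.5 * (
        von_neumann_entropy(rho, base) + von_neumann_entropy(sigma, base)
    )

# Two pure qubit states |0> and |+>:
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
sigma = np.outer(plus, plus)
print(qjsd(rho, sigma))  # strictly between 0 and 1, since the states are neither equal nor orthogonal
```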

Generalization

Nielsen introduced the skew K-divergence:[15]

$$D^{K,\alpha}(P \parallel Q) = D\bigl(P \parallel (1 - \alpha) P + \alpha Q\bigr).$$

From it follows a one-parameter family of Jensen–Shannon divergences, called the $\alpha$-Jensen–Shannon divergences:

$${\rm JSD}^{\alpha}(P \parallel Q) = \frac{1}{2}\Bigl[ D\bigl(P \parallel (1-\alpha) P + \alpha Q\bigr) + D\bigl(Q \parallel (1-\alpha) Q + \alpha P\bigr) \Bigr],$$

which includes the Jensen–Shannon divergence (for $\alpha = \tfrac{1}{2}$) and half of the Jeffreys divergence (for $\alpha = 1$).
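A short numerical check of the two special cases, under the parameterization written above (which is our reading of [15]; the helper `alpha_jsd` is illustrative):

```python
import numpy as np
from scipy.stats import entropy

def alpha_jsd(p, q, alpha, base=2):
    """alpha-Jensen-Shannon divergence: averaged KL terms against skewed mixtures."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * (entropy(p, (1 - alpha) * p + alpha * q, base=base)
                  + entropy(q, (1 - alpha) * q + alpha * p, base=base))

p = np.array([0.10, 0.40, 0.50])
q = np.array([0.80, 0.15, 0.05])
m = 0.5 * (p + q)

jsd = 0.5 * entropy(p, m, base=2) + 0.5 * entropy(q, m, base=2)
jeffreys = entropy(p, q, base=2) + entropy(q, p, base=2)
print(np.isclose(alpha_jsd(p, q, 0.5), jsd))             # alpha = 1/2 recovers the JSD
print(np.isclose(alpha_jsd(p, q, 1.0), 0.5 * jeffreys))  # alpha = 1 gives half the Jeffreys divergence
```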

Applications

The Jensen–Shannon divergence has been applied in bioinformatics and genome comparison,[16][17] in protein surface comparison,[18] in the social sciences,[19] in the quantitative study of history,[20] and in machine learning.[21]

Notes

  1. Hinrich Schütze; Christopher D. Manning (1999). Foundations of Statistical Natural Language Processing. Cambridge, Mass: MIT Press. p. 304. ISBN 978-0-262-13360-9.
  2. Dagan, Ido; Lillian Lee; Fernando Pereira (1997). "Similarity-Based Methods For Word Sense Disambiguation". Proceedings of the Thirty-Fifth Annual Meeting of the Association for Computational Linguistics and Eighth Conference of the European Chapter of the Association for Computational Linguistics: 56–63. arXiv:cmp-lg/9708010. Bibcode:1997cmp.lg....8010D. doi:10.3115/979617.979625. Retrieved 2008-03-09.
  3. Endres, D. M.; J. E. Schindelin (2003). "A new metric for probability distributions" (PDF). IEEE Trans. Inf. Theory. 49 (7): 1858–1860. doi:10.1109/TIT.2003.813506.
  4. Österreicher, F.; I. Vajda (2003). "A new class of metric divergences on probability spaces and its statistical applications". Ann. Inst. Statist. Math. 55 (3): 639–653. doi:10.1007/BF02517812.
  5. Fuglede, B.; Topsoe, F. (2004). "Jensen-Shannon divergence and Hilbert space embedding" (PDF). Proceedings of the International Symposium on Information Theory, 2004. IEEE. p. 30. doi:10.1109/ISIT.2004.1365067. ISBN 978-0-7803-8280-0.
  6. Nielsen, Frank (2019). "On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means". arXiv:1904.04017 [cs.IT].
  7. Lin, J. (1991). "Divergence measures based on the shannon entropy" (PDF). IEEE Transactions on Information Theory. 37 (1): 145–151. CiteSeerX 10.1.1.127.9167. doi:10.1109/18.61115.
  8. Schneidman, Elad; Bialek, W; Berry, M.J. 2nd (2003). "Synergy, Redundancy, and Independence in Population Codes". Journal of Neuroscience. 23 (37): 11539–11553. doi:10.1523/JNEUROSCI.23-37-11539.2003. PMID 14684857.
  9. Majtey, A.; Lamberti, P.; Prato, D. (2005). "Jensen-Shannon divergence as a measure of distinguishability between mixed quantum states". Physical Review A. 72 (5): 052310. arXiv:quant-ph/0508138. Bibcode:2005PhRvA..72e2310M. doi:10.1103/PhysRevA.72.052310.
  10. Briët, Jop; Harremoës, Peter (2009). "Properties of classical and quantum Jensen-Shannon divergence". Physical Review A. 79 (5): 052311. arXiv:0806.4472. Bibcode:2009PhRvA..79e2311B. doi:10.1103/PhysRevA.79.052311.
  11. Holevo, A. S. (1973), "Bounds for the quantity of information transmitted by a quantum communication channel", Problemy Peredachi Informatsii (in Russian), 9: 3–11. English translation: Probl. Inf. Transm., 9: 177–183 (1975) MR456936
  12. Braunstein, Samuel; Caves, Carlton (1994). "Statistical distance and the geometry of quantum states". Physical Review Letters. 72 (22): 3439–3443. Bibcode:1994PhRvL..72.3439B. doi:10.1103/PhysRevLett.72.3439. PMID 10056200.
  13. Virosztek, Dániel (2019). "The metric property of the quantum Jensen-Shannon divergence". arXiv:1910.10447.
  14. Sra, Suvrit (2019). "Metrics Induced by Quantum Jensen-Shannon-Renyí and Related Divergences". arXiv:1911.02643.
  15. Nielsen, Frank (2010). "A family of statistical symmetric divergences based on Jensen's inequality". arXiv:1009.4004 [cs.CV].
  16. Sims, GE; Jun, SR; Wu, GA; Kim, SH (2009). "Alignment-free genome comparison with feature frequency profiles (FFP) and optimal resolutions". Proceedings of the National Academy of Sciences of the United States of America. 106 (8): 2677–82. Bibcode:2009PNAS..106.2677S. doi:10.1073/pnas.0813249106. PMC 2634796. PMID 19188606.
  17. Itzkovitz, S; Hodis, E; Segal, E (2010). "Overlapping codes within protein-coding sequences". Genome Research. 20 (11): 1582–9. doi:10.1101/gr.105072.110. PMC 2963821. PMID 20841429.
  18. Ofran, Y; Rost, B (2003). "Analysing six types of protein-protein interfaces". Journal of Molecular Biology. 325 (2): 377–87. CiteSeerX 10.1.1.6.9207. doi:10.1016/s0022-2836(02)01223-8. PMID 12488102.
  19. DeDeo, Simon; Hawkins, Robert X. D.; Klingenstein, Sara; Hitchcock, Tim (2013). "Bootstrap Methods for the Empirical Study of Decision-Making and Information Flows in Social Systems". Entropy. 15 (6): 2246–2276. arXiv:1302.0907. Bibcode:2013Entrp..15.2246D. doi:10.3390/e15062246.
  20. Klingenstein, Sara; Hitchcock, Tim; DeDeo, Simon (2014). "The civilizing process in London's Old Bailey". Proceedings of the National Academy of Sciences. 111 (26): 9419–9424. Bibcode:2014PNAS..111.9419K. doi:10.1073/pnas.1405984111. PMC 4084475. PMID 24979792.
  21. Goodfellow, Ian J.; Pouget-Abadie, Jean; Mirza, Mehdi; Xu, Bing; Warde-Farley, David; Ozair, Sherjil; Courville, Aaron; Bengio, Yoshua (2014). Generative Adversarial Networks. NIPS. arXiv:1406.2661. Bibcode:2014arXiv1406.2661G.

Further reading

  • Frank Nielsen (2010). "A family of statistical symmetric divergences based on Jensen's inequality". arXiv:1009.4004 [cs.CV].