Bhattacharyya distance

In statistics, the Bhattacharyya distance measures the similarity of two probability distributions. It is closely related to the Bhattacharyya coefficient, which is a measure of the amount of overlap between two statistical samples or populations. Both measures are named after Anil Kumar Bhattacharyya, a statistician who worked in the 1930s at the Indian Statistical Institute.[1]

The coefficient can be used to determine the relative closeness of the two samples being considered. It is used to measure the separability of classes in classification, and it is considered to be more reliable than the Mahalanobis distance, as the Mahalanobis distance is a particular case of the Bhattacharyya distance when the standard deviations of the two classes are the same. Consequently, when two classes have similar means but different standard deviations, the Mahalanobis distance tends to zero, whereas the Bhattacharyya distance grows with the difference between the standard deviations.

Definition

For probability distributions p and q over the same domain X, the Bhattacharyya distance is defined as

$D_B(p, q) = -\ln\left(BC(p, q)\right),$

where

$BC(p, q) = \sum_{x \in X} \sqrt{p(x)\, q(x)}$

is the Bhattacharyya coefficient for discrete probability distributions.

For continuous probability distributions, the Bhattacharyya coefficient is defined as

$BC(p, q) = \int \sqrt{p(x)\, q(x)}\, dx.$

In either case, $0 \le BC(p, q) \le 1$ and $0 \le D_B(p, q) \le \infty$. $D_B$ does not obey the triangle inequality, but the Hellinger distance, which is given by $\sqrt{1 - BC(p, q)}$, does obey the triangle inequality.
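
As a quick illustration of the discrete case, here is a minimal Python sketch (the function name and the example distributions are chosen for illustration, not taken from the sources cited above):

    import math

    def bhattacharyya(p, q):
        """Bhattacharyya coefficient and distance for two discrete
        distributions given as sequences of probabilities over the
        same domain (each sequence should sum to 1)."""
        bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
        # The distance is infinite when the coefficient is 0 (no overlap).
        db = float("inf") if bc == 0 else -math.log(bc)
        return bc, db

    # Two distributions over the same three outcomes.
    p = [0.1, 0.6, 0.3]
    q = [0.2, 0.5, 0.3]
    print(bhattacharyya(p, q))  # coefficient near 1, distance near 0

For identical distributions the coefficient is exactly 1 and the distance is 0; as the distributions share less probability mass, the coefficient shrinks and the distance grows.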

A triparametric generalization[2] of the Bhattacharyya distance has been reported.

In its simplest formulation, the Bhattacharyya distance between two classes under the normal distribution can be calculated[3] by extracting the mean and variances of two separate distributions or classes:

$D_B(p, q) = \frac{1}{4} \ln\left( \frac{1}{4} \left( \frac{\sigma_p^2}{\sigma_q^2} + \frac{\sigma_q^2}{\sigma_p^2} + 2 \right) \right) + \frac{1}{4} \left( \frac{(\mu_p - \mu_q)^2}{\sigma_p^2 + \sigma_q^2} \right)$

where:

  $D_B(p, q)$ is the Bhattacharyya distance between the p and q distributions or classes,
  $\sigma_p^2$ is the variance of the p-th distribution,
  $\mu_p$ is the mean of the p-th distribution, and
  p, q are the two different distributions.
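
A minimal sketch of this univariate formula in Python, assuming only the means and variances of the two classes are known (the function and variable names are illustrative):

    import math

    def bhattacharyya_normal(mu_p, var_p, mu_q, var_q):
        """Bhattacharyya distance between the univariate normal
        distributions N(mu_p, var_p) and N(mu_q, var_q)."""
        term_var = 0.25 * math.log(0.25 * (var_p / var_q + var_q / var_p + 2.0))
        term_mean = 0.25 * (mu_p - mu_q) ** 2 / (var_p + var_q)
        return term_var + term_mean

    # Equal variances: only the mean term contributes, as in the Mahalanobis case.
    print(bhattacharyya_normal(0.0, 1.0, 1.0, 1.0))  # 0.125
    # Equal means, different variances: the distance is still positive.
    print(bhattacharyya_normal(0.0, 1.0, 0.0, 4.0))  # about 0.112

The second call illustrates the point made in the introduction: with identical means the Mahalanobis-like term vanishes, but the variance term keeps the Bhattacharyya distance strictly positive.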

The Mahalanobis distance used in Fisher's linear discriminant analysis is a particular case of the Bhattacharyya distance.

For multivariate normal distributions $p_i = \mathcal{N}(\mu_i, \Sigma_i)$,

$D_B = \frac{1}{8} (\mu_1 - \mu_2)^T \Sigma^{-1} (\mu_1 - \mu_2) + \frac{1}{2} \ln\left( \frac{\det \Sigma}{\sqrt{\det \Sigma_1 \det \Sigma_2}} \right),$

where $\mu_i$ and $\Sigma_i$ are the means and covariances of the distributions, and

$\Sigma = \frac{\Sigma_1 + \Sigma_2}{2}.$

Note that, in this case, the first term in the Bhattacharyya distance is related to the Mahalanobis distance.
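
A sketch of the multivariate case using NumPy, assuming the means and covariance matrices of the two distributions are available (the function name is illustrative):

    import numpy as np

    def bhattacharyya_mvn(mu1, cov1, mu2, cov2):
        """Bhattacharyya distance between the multivariate normals
        N(mu1, cov1) and N(mu2, cov2)."""
        cov = 0.5 * (cov1 + cov2)          # Sigma = (Sigma_1 + Sigma_2) / 2
        diff = mu1 - mu2
        # First term: one eighth of the squared Mahalanobis distance under Sigma.
        term_mahalanobis = 0.125 * diff @ np.linalg.solve(cov, diff)
        # Second term: compares the pooled covariance with the individual ones,
        # using log-determinants for numerical stability.
        term_cov = 0.5 * (np.linalg.slogdet(cov)[1]
                          - 0.5 * (np.linalg.slogdet(cov1)[1]
                                   + np.linalg.slogdet(cov2)[1]))
        return term_mahalanobis + term_cov

    mu1, cov1 = np.array([0.0, 0.0]), np.eye(2)
    mu2, cov2 = np.array([1.0, 0.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
    print(bhattacharyya_mvn(mu1, cov1, mu2, cov2))

When the two covariance matrices are equal, the second term is zero and the distance reduces to one eighth of the squared Mahalanobis distance between the means.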

Bhattacharyya coefficient

The Bhattacharyya coefficient is an approximate measurement of the amount of overlap between two statistical samples. The coefficient can be used to determine the relative closeness of the two samples being considered.

Calculating the Bhattacharyya coefficient involves a rudimentary form of integration of the overlap of the two samples. The interval of the values of the two samples is split into a chosen number of partitions, and the number of members of each sample in each partition is used in the following formula:

$BC(p, q) = \sum_{i=1}^{n} \sqrt{p_i\, q_i}$ [4]

where, considering the samples p and q, n is the number of partitions, and $p_i$, $q_i$ are the numbers of members of samples p and q in the i-th partition.

This formula is hence larger with each partition that has members from both samples, and larger with each partition that has a large overlap of the two samples' members within it. The choice of the number of partitions depends on the number of members in each sample; too few partitions will lose accuracy by overestimating the overlap region, and too many partitions will lose accuracy by creating individual partitions with no members despite being in a densely populated sample space.

The Bhattacharyya coefficient will be 0 if there is no overlap at all due to the multiplication by zero in every partition. This means the distance between fully separated samples will not be exposed by this coefficient alone.
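
The partition-based estimate described above can be sketched as follows; the use of NumPy, the equal-width partitions, and the normalisation of the counts so that the coefficient falls between 0 and 1 are assumptions made for this illustration:

    import numpy as np

    def bhattacharyya_coefficient(sample_p, sample_q, n_partitions=20):
        """Estimate the Bhattacharyya coefficient of two 1-D samples by
        splitting their combined range into equal-width partitions."""
        lo = min(np.min(sample_p), np.min(sample_q))
        hi = max(np.max(sample_p), np.max(sample_q))
        edges = np.linspace(lo, hi, n_partitions + 1)
        # Number of members of each sample in each partition,
        # normalised so that each histogram sums to 1.
        p_counts, _ = np.histogram(sample_p, bins=edges)
        q_counts, _ = np.histogram(sample_q, bins=edges)
        p_frac = p_counts / p_counts.sum()
        q_frac = q_counts / q_counts.sum()
        # Partitions containing members of only one sample contribute 0,
        # so fully separated samples give a coefficient of 0.
        return float(np.sum(np.sqrt(p_frac * q_frac)))

    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, 1000)
    b = rng.normal(0.5, 1.0, 1000)
    print(bhattacharyya_coefficient(a, b))  # close to 1 for heavily overlapping samples

Increasing n_partitions sharpens the estimate only up to a point; as noted above, partitions so narrow that they contain no members of either sample degrade the estimate again.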

The Bhattacharyya coefficient is used in the construction of polar codes.[5]

Applications

The Bhattacharyya distance is widely used in research on feature extraction and selection,[6] image processing,[7] speaker recognition,[8] and phone clustering.[9]

A "Bhattacharyya space" has been proposed as a feature selection technique that can be applied to texture segmentation.[10]


References

  1. Bhattacharyya, A. (1943). "On a measure of divergence between two statistical populations defined by their probability distributions". Bulletin of the Calcutta Mathematical Society. 35: 99–109. MR 0010358.
  2. Frank Nielsen. A generalization of the Jensen divergence: The chord gap divergence. arxiv 2017 (ICASSP 2018). arXiv:1709.10498
  3. Guy B. Coleman, Harry C. Andrews, "Image Segmentation by Clustering", Proc IEEE, Vol. 67, No. 5, pp. 773–785, 1979
  4. D. Comaniciu, V. Ramesh, P. Meer, Real-Time Tracking of Non-Rigid Objects using Mean Shift Archived 2010-08-14 at the Wayback Machine, BEST PAPER AWARD, IEEE Conf. Computer Vision and Pattern Recognition (CVPR'00), Hilton Head Island, South Carolina, Vol. 2, 142–149, 2000
  5. Arıkan, Erdal (July 2009). "Channel polarization: A method for constructing capacity-achieving codes for symmetric binary-input memoryless channels". IEEE Transactions on Information Theory. 55 (7): 3051–3073. arXiv:0807.3917. doi:10.1109/TIT.2009.2021379.
  6. Euisun Choi, Chulhee Lee, "Feature extraction based on the Bhattacharyya distance", Pattern Recognition, Volume 36, Issue 8, August 2003, Pages 1703–1709
  7. François Goudail, Philippe Réfrégier, Guillaume Delyon, "Bhattacharyya distance as a contrast parameter for statistical processing of noisy optical images", JOSA A, Vol. 21, Issue 7, pp. 1231−1240 (2004)
  8. Chang Huai You, "An SVM Kernel With GMM-Supervector Based on the Bhattacharyya Distance for Speaker Recognition", Signal Processing Letters, IEEE, Vol 16, Is 1, pp. 49-52
  9. Mak, B., "Phone clustering using the Bhattacharyya distance", Spoken Language, 1996. ICSLP 96. Proceedings., Fourth International Conference on, Vol 4, pp. 2005–2008 vol.4, 3−6 Oct 1996
  10. Reyes-Aldasoro, C.C., and A. Bhalerao, "The Bhattacharyya space for feature selection and its application to texture segmentation", Pattern Recognition, (2006) Vol. 39, Issue 5, May 2006, pp. 812–826
  • Nielsen, F.; Boltz, S. (2010). "The Burbea–Rao and Bhattacharyya centroids". IEEE Transactions on Information Theory. 57 (8): 5455–5466. arXiv:1004.5049. doi:10.1109/TIT.2011.2159046.
  • Kailath, T. (1967). "The Divergence and Bhattacharyya Distance Measures in Signal Selection". IEEE Transactions on Communication Technology. 15 (1): 52–60. doi:10.1109/TCOM.1967.1089532.