
Kullback–Leibler information



Also called the Kullback–Leibler quantity of information, the Kullback–Leibler information quantity, or the directed divergence.

For discrete distributions (cf. Discrete distribution) given by probability vectors $p = (p_1, \dots, p_n)$ and $q = (q_1, \dots, q_n)$, the Kullback–Leibler (quantity of) information of $p$ with respect to $q$ is:

$$ I(p;q) = \sum_{i=1}^{n} p_i ( \log p_i - \log q_i ), $$

where $\log$ is the natural logarithm (cf. also Logarithm of a number).
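For example, for the illustrative distributions $p = (1/2, 1/2)$ and $q = (3/4, 1/4)$ one finds

$$ I(p;q) = \frac{1}{2} \log \frac{1/2}{3/4} + \frac{1}{2} \log \frac{1/2}{1/4} = \log 2 - \frac{1}{2} \log 3 \approx 0.144, $$

while $I(q;p) = \frac{3}{4} \log 3 - \log 2 \approx 0.131$, showing that in general $I(p;q) \neq I(q;p)$.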

More generally, one has:

$$ I(P;Q) = \int\limits_\Omega \log \frac{p(\omega)}{q(\omega)} \, P(d\omega) $$

for probability distributions $P(d\omega)$ and $Q(d\omega)$ with densities $p(\omega)$ and $q(\omega)$ (cf. Density of a probability distribution).
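For example, if $P$ and $Q$ are normal distributions (cf. Normal distribution) on $\Omega = \mathbf{R}$ with a common variance $\sigma^2$ and means $\mu_1$ and $\mu_2$, then $\log p(x) - \log q(x) = ( (x - \mu_2)^2 - (x - \mu_1)^2 ) / 2 \sigma^2$, and integrating against $P(dx)$ gives

$$ I(P;Q) = \frac{( \mu_1 - \mu_2 )^2}{2 \sigma^2} . $$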

The negative of $I(P;Q)$ is the conditional entropy (or relative entropy) of $P(d\omega)$ with respect to $Q(d\omega)$; see Entropy.

Various notions of (asymmetric and symmetric) information distances are based on the Kullback–Leibler information.
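For instance, one common symmetric quantity built from it, sometimes called the divergence (or $J$-divergence), is

$$ J(P,Q) = I(P;Q) + I(Q;P), $$

which is symmetric in $P$ and $Q$; for the discrete example above, $J(p,q) \approx 0.275$.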

The quantity $I(p;q)$ is also called the informational divergence (see Huffman code).

See also Information distance; Kullback–Leibler-type distance measures.

References

[a1] S. Kullback, "Information theory and statistics", Wiley (1959)
[a2] S. Kullback, R.A. Leibler, "On information and sufficiency", Ann. Math. Stat., 22 (1951) pp. 79–86
[a3] Y. Sakamoto, M. Ishiguro, G. Kitagawa, "Akaike information criterion statistics", Reidel (1986)
How to Cite This Entry:
Kullback–Leibler information. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Kullback%E2%80%93Leibler_information&oldid=22682