
Information Theory is the quantitative study of information. One of the most important measures in information theory is entropy.

Given a probability distribution <latex>P = \{p_1, p_2, \dots\}</latex>, the entropy of that distribution, <latex>H(P)</latex>, is given by:

<latex> H(P) = -\sum_i p_i \log_2(p_i) </latex>

Note that the base of the logarithm is 2, which gives the entropy in units of bits. Throughout, logarithms may be assumed to be base 2 unless otherwise noted.

Entropy can be thought of as the average number of yes-no questions required to determine the particular value taken by a random variable distributed according to <latex> P </latex>. For example, a fair coin flip has entropy 1 bit: on average, one yes-no question suffices.
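
To make the formula concrete, here is a minimal Python sketch (not part of the original page) that computes the entropy of a distribution given as a plain list of probabilities. The function name ''entropy_bits'' is chosen here for illustration.

<code python>
import math

def entropy_bits(probs):
    """Return H(P) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 contribute nothing, by the usual
    convention that 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin needs one yes-no question on average: H = 1 bit.
print(entropy_bits([0.5, 0.5]))        # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(entropy_bits([0.9, 0.1]))        # ~0.469
# A uniform distribution over 4 outcomes: 2 bits (two yes-no questions).
print(entropy_bits([0.25] * 4))        # 2.0
</code>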

Other measures of relevance include:

* joint entropy

* mutual information

* Kullback–Leibler divergence
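
These measures are not defined on this page; the sketch below uses their standard textbook definitions, with all logarithms base 2 as above. Distributions are represented as plain Python dicts and lists purely for illustration.

<code python>
import math

def joint_entropy(pxy):
    """H(X, Y) = -sum_{x,y} p(x,y) log2 p(x,y); pxy maps (x, y) -> probability."""
    return -sum(p * math.log2(p) for p in pxy.values() if p > 0)

def mutual_information(pxy):
    """I(X; Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    # Marginal distributions p(x) and p(y), obtained by summing out the other variable.
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

def kl_divergence(p, q):
    """D(P || Q) = sum_i p_i log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: two perfectly correlated fair bits share 1 bit of information.
pxy = {(0, 0): 0.5, (1, 1): 0.5}
print(joint_entropy(pxy))        # 1.0
print(mutual_information(pxy))   # 1.0
# KL divergence of a biased coin from a fair coin.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.531
</code>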
