Information entropy

[Figure (Binary entropy plot.png): Entropy of a Bernoulli trial as a function of success probability.]

Entropy is a concept in thermodynamics (see thermodynamic entropy), statistical mechanics and information theory. The concepts of information and entropy have deep links with one another, although it took many years for the development of the theories of statistical mechanics and information theory to make this apparent. This article is about information entropy, the information-theoretic formulation of entropy.

Basic concept

The basic concept of entropy in information theory has to do with how much randomness there is in a signal or random event. An alternative way to look at this is to talk about how much information is carried by the signal.

As an example, consider some English text, encoded as a string of letters, spaces and punctuation (so our signal is a string of characters). Since some characters are not very likely (e.g. 'z') while others are very common (e.g. 'e'), the string of characters is not really as random as it might be. On the other hand, since we cannot predict what the next character will be, it does have some 'randomness'. Entropy is a measure of this randomness, suggested by Claude E. Shannon in his 1948 paper A Mathematical Theory of Communication.

Shannon offers a definition of entropy that satisfies the following assumptions:

  • The measure should be continuous - i.e. changing the value of one of the probabilities by a very small amount should only change the entropy by a small amount.
  • If all the outcomes (letters in the example above) are equally likely then increasing the number of letters should always increase the entropy.
  • We should be able to make the choice (in our example, of a letter) in two steps, in which case the entropy of the final result should be a weighted sum of the entropies of the two steps (a numerical check of this property is sketched just below).
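
The following minimal Python sketch checks the two-step (grouping) property numerically; the three-outcome distribution mirrors Shannon's own illustration and is otherwise arbitrary:

    from math import log2

    def H(*probs):
        """Shannon entropy (base 2) of a discrete distribution."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # Choosing among {1/2, 1/3, 1/6} directly ...
    direct = H(1/2, 1/3, 1/6)

    # ... or in two steps: first a fair choice between the first outcome and
    # "the rest", then (with weight 1/2) a choice between the remaining two
    # outcomes, which now have conditional probabilities 2/3 and 1/3.
    two_step = H(1/2, 1/2) + (1/2) * H(2/3, 1/3)

    print(direct, two_step)  # both print ~1.459 bits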

(Note: Shannon and Weaver refer to Tolman (1938), who in turn credits Pauli (1933) with the definition of entropy used by Shannon. Elsewhere in the statistical-mechanics literature, von Neumann is credited with having derived the same form of entropy in 1927, which is why von Neumann favoured reusing the existing term 'entropy'.)

Formal definitions

Claude E. Shannon defines entropy in terms of a discrete random event x, with possible states 1, …, n, as:

H(x) = \sum_{i=1}^{n} p(i) \log_2\left(\frac{1}{p(i)}\right) = -\sum_{i=1}^{n} p(i) \log_2 p(i).

That is, the entropy of the event x is the sum, over all possible outcomes i of x, of the probability of outcome i times the log of the inverse of the probability of i (the latter quantity, log(1/p(i)), is also called i's surprisal); the entropy of x is thus the expected value of its outcomes' surprisal. We can also apply this to a general probability distribution, rather than a discrete-valued event.
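
As a concrete illustration, the following is a minimal Python sketch of this definition, computing the entropy as the expected surprisal of the outcomes (the four-outcome distribution is made up for the example):

    from math import log2

    def surprisal(p):
        """Surprisal (self-information) of an outcome with probability p, in bits."""
        return -log2(p)

    def entropy(probs):
        """Entropy of a discrete event: the expected surprisal of its outcomes."""
        return sum(p * surprisal(p) for p in probs if p > 0)

    probs = [0.5, 0.25, 0.125, 0.125]     # a hypothetical four-outcome event
    print([surprisal(p) for p in probs])  # [1.0, 2.0, 3.0, 3.0] bits
    print(entropy(probs))                 # 1.75 bits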

Shannon shows that any definition of entropy satisfying his assumptions will be of the form:

-K \sum_{i=1}^{n} p(i) \log p(i),

where K is a constant (and is really just a choice of measurement units).

Shannon defined a measure of entropy (H = −p_1 log_2 p_1 − … − p_n log_2 p_n) that, when applied to an information source, could determine the minimum channel capacity required to reliably transmit the source as encoded binary digits. The formula can be derived by calculating the mathematical expectation of the amount of information contained in a digit from the information source. Shannon's entropy measure came to be taken as a measure of the uncertainty about the realization of a random variable. It thus served as a proxy capturing the concept of information contained in a message, as opposed to the portion of the message that is strictly determined (hence predictable) by inherent structures, such as redundancy in the structure of a language or the statistical properties relating to the occurrence frequencies of letter or word pairs, triplets, etc. (see Markov chain).

Shannon's definition of entropy is closely related to thermodynamic entropy as defined by physicists and many chemists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, which became the inspiration for adopting the word entropy in information theory. There are relationships between thermodynamic and informational entropy. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. (See article: MaxEnt thermodynamics). Similarly, Maxwell's demon reverses thermodynamic entropy with information; but if it is itself bound by the laws of thermodynamics, getting rid of that information exactly balances out the thermodynamic gain the demon would otherwise achieve.

It is important to remember that entropy is a quantity defined in the context of a probabilistic model for a data source. Independent fair coin flips have an entropy of 1 bit per flip. A source that always generates a long string of A's has an entropy of 0, since the next character will always be an 'A'.
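
Both claims can be checked with a plug-in estimate computed from observed symbols; the two sample strings below are purely illustrative:

    from collections import Counter
    from math import log2

    def empirical_entropy(symbols):
        """Entropy (bits/symbol) of the empirical frequency distribution."""
        counts = Counter(symbols)
        total = len(symbols)
        return sum((c / total) * log2(total / c) for c in counts.values())

    print(empirical_entropy("HTTHHTTHTH"))  # 1.0 bit per flip (5 heads, 5 tails)
    print(empirical_entropy("AAAAAAAAAA"))  # 0.0 bits: the next character is always 'A'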

The entropy rate of a data source is the average number of bits per symbol needed to encode it. Empirically, the entropy of English text appears to be between 1.1 and 1.6 bits per character, though this clearly varies from one text source to another. Experiments with human predictors show an information rate of between 1.1 and 1.6 bits per character, depending on the experimental setup; the PPM compression algorithm can compress English text to about 1.5 bits per character.

From the preceding example, note the following points:

  1. The amount of entropy is not always an integer number of bits.
  2. Many data bits may not convey information. For example, data structures often store information redundantly, or have identical sections regardless of the information in the data structure.

Entropy effectively bounds the performance of the strongest lossless (or nearly lossless) compression possible, which can be realized in theory by using the typical set or in practice using Huffman, Lempel-Ziv or arithmetic coding. The performance of existing data compression algorithms is often used as a rough estimate of the entropy of a block of data.
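
As a rough sketch of that last point, a general-purpose compressor such as zlib (used here only because it ships with Python's standard library) gives a crude upper bound on the bits per byte of a block of data; the sample inputs are arbitrary:

    import os
    import zlib

    def compressed_bits_per_byte(data: bytes) -> float:
        """Crude entropy estimate: compressed size in bits per input byte."""
        return 8 * len(zlib.compress(data, 9)) / len(data)

    english = b"to be or not to be that is the question " * 200
    random_bytes = os.urandom(8000)  # effectively incompressible

    print(compressed_bits_per_byte(english))       # well below 8 bits per byte
    print(compressed_bits_per_byte(random_bytes))  # about 8 bits per byte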

A common way to define entropy for text is based on a Markov model of the text. For an order-0 source (each character is selected independently of the preceding characters), the binary entropy is:

H(\mathcal{S}) = -\sum_i p_i \log_2 p_i,

where p_i is the probability of character i. For a first-order Markov source (one in which the probability of a character depends on the immediately preceding character, but not on earlier history), the entropy rate is:

H(\mathcal{S}) = -\sum_i p_i \sum_j p_i(j) \log_2 p_i(j),

where i is a state (certain preceding characters) and p_i(j) is the probability of j given i as the preceding character(s).
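
A minimal sketch of this first-order estimate in Python, fitting the probabilities p_i and p_i(j) to the character frequencies of a sample string (the sample text is arbitrary and far too short for a serious estimate):

    from collections import Counter, defaultdict
    from math import log2

    def first_order_entropy_rate(text):
        """Entropy rate (bits/char) of a first-order Markov model fitted to text."""
        followers = defaultdict(Counter)      # followers[i][j] = count of j after i
        for prev, nxt in zip(text, text[1:]):
            followers[prev][nxt] += 1

        total = len(text) - 1
        state_counts = Counter(text[:-1])     # occurrences of each state i

        h = 0.0
        for i, counts in followers.items():
            p_i = state_counts[i] / total     # p_i
            n_i = sum(counts.values())
            for j, c in counts.items():
                p_ij = c / n_i                # p_i(j)
                h -= p_i * p_ij * log2(p_ij)
        return h

    sample = "the theme of the thesis is the theory of the thing " * 20
    print(first_order_entropy_rate(sample))   # lower than the order-0 estimate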

For a second order Markov source, the entropy rate is

H(\mathcal{S}) = -\sum_i p_i \sum_j p_i(j) \sum_k p_{i,j}(k) \log_2 p_{i,j}(k).

In general, the b-ary entropy of a source \mathcal{S} = (S, P) with source alphabet S = {a_1, …, a_n} and discrete probability distribution P = {p_1, …, p_n}, where p_i is the probability of a_i (say p_i = p(a_i)), is defined by:

H_b(\mathcal{S}) = -\sum_{i=1}^{n} p_i \log_b p_i

Note: the b in "b-ary entropy" is the number of different symbols of the "ideal alphabet" being used as the standard yardstick to measure source alphabets. In information theory, two symbols are necessary and sufficient for an alphabet to encode information, so the default is to let b = 2 ("binary entropy"). Thus, the entropy of the source alphabet, with its given empirical probability distribution, is a number equal to the number (possibly fractional) of symbols of the "ideal alphabet", with an optimal probability distribution, needed to encode each symbol of the source alphabet. Also note that "optimal probability distribution" here means a uniform distribution: a source alphabet with n symbols has the highest possible entropy (for an alphabet with n symbols) when its probability distribution is uniform. This optimal entropy turns out to be \log_b n.
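
A quick numerical check of this maximum (a sketch in Python; the alphabet size, base and skewed distribution are arbitrary choices):

    from math import log

    def b_ary_entropy(probs, b=2):
        """H_b of a discrete probability distribution."""
        return -sum(p * log(p, b) for p in probs if p > 0)

    n, b = 8, 2
    uniform = [1 / n] * n
    skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02]

    print(b_ary_entropy(uniform, b), log(n, b))  # both approximately 3.0: uniform reaches log_b n
    print(b_ary_entropy(skewed, b))              # strictly less than 3.0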

Another way to define the entropy function H (without using the Markov model) is to show that H is uniquely determined (up to the constant K mentioned earlier) if and only if H satisfies conditions 1)–3):

1) H(p_1, …, p_n) is defined and continuous for all p_1, …, p_n with p_i ∈ [0, 1] for all i = 1, …, n and p_1 + … + p_n = 1. (Note that the function depends solely on the probability distribution, not on the alphabet.)

2) For all positive integers n, H satisfies

H\underbrace{\left(\frac{1}{n}, \ldots, \frac{1}{n}\right)}_{n\ \mathrm{arguments}} < H\underbrace{\left(\frac{1}{n+1}, \ldots, \frac{1}{n+1}\right)}_{n+1\ \mathrm{arguments}}

3) For positive integers b_1, …, b_k where b_1 + … + b_k = n, H satisfies

H\underbrace{\left(\frac{1}{n}, \ldots, \frac{1}{n}\right)}_{n} = H\left(\frac{b_1}{n}, \ldots, \frac{b_k}{n}\right) + \sum_{i=1}^{k} \frac{b_i}{n}\, H\underbrace{\left(\frac{1}{b_i}, \ldots, \frac{1}{b_i}\right)}_{b_i}.
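
Condition 3) can be verified on a small concrete case; the choices n = 6 and b = (3, 2, 1) below are arbitrary:

    from math import log2

    def H(*probs):
        """Shannon entropy (base 2) of a discrete distribution."""
        return -sum(p * log2(p) for p in probs if p > 0)

    n, b = 6, [3, 2, 1]                 # b_1 + b_2 + b_3 = n
    lhs = H(*([1 / n] * n))             # entropy of the uniform distribution on n outcomes
    rhs = H(*[bi / n for bi in b]) + sum(
        (bi / n) * H(*([1 / bi] * bi)) for bi in b
    )
    print(lhs, rhs)                     # both ~2.585 bits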

Efficiency

A source alphabet encountered in practice will typically have a probability distribution that is less than optimal. If the source alphabet has n symbols, it can be compared to an "optimized alphabet" with n symbols whose probability distribution is uniform. The ratio of the entropy of the source alphabet to the entropy of its optimized version is the efficiency of the source alphabet, which can be expressed as a percentage.

This implies that the efficiency of a source alphabet with n symbols can be defined simply as its n-ary entropy, since the n-ary entropy of the uniform distribution over n symbols equals 1.
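
A minimal sketch of this efficiency calculation (the four-symbol distribution is made up for the example, not measured from any real source):

    from math import log2

    def efficiency(probs):
        """Entropy of the distribution relative to a uniform one of the same size."""
        n = len(probs)
        h = -sum(p * log2(p) for p in probs if p > 0)
        return h / log2(n)

    print(efficiency([0.7, 0.15, 0.1, 0.05]))    # about 0.66, i.e. roughly 66% efficient
    print(efficiency([0.25, 0.25, 0.25, 0.25]))  # 1.0, i.e. 100% efficient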

Derivation of Shannon's entropy

Since the entropy was given as a definition, it does not need to be derived. On the other hand, a "derivation" can be given which gives a sense of the motivation for the definition as well as the link to thermodynamic entropy.

Q. Given a roulette wheel with n pockets, each equally likely to be landed on by the ball, what is the probability of obtaining a distribution (A_1, A_2, …, A_n), where A_i is the number of times pocket i was landed on and

P = \sum_{i=1}^{n} A_i

is the total number of ball-landing events?

A. The probability is a multinomial distribution, viz.

p = \frac{\Omega}{T} = \frac{P!}{A_1!\, A_2!\, A_3! \cdots A_n!} \left(\frac{1}{n}\right)^P

where

\Omega = \frac{P!}{A_1!\, A_2!\, A_3! \cdots A_n!}

is the number of possible combinations of outcomes (for the events) which fit the given distribution, and

T = n^P

is the number of all possible combinations of outcomes for the set of P events.

Q. And what is the entropy?

A. The entropy of the distribution is obtained from the logarithm of Ω:

H = \log \Omega = \log \frac{P!}{A_1!\, A_2!\, A_3! \cdots A_n!}
= \log P! - \log A_1! - \log A_2! - \log A_3! - \cdots - \log A_n!
= \sum_{i=1}^{P} \log i - \sum_{i=1}^{A_1} \log i - \sum_{i=1}^{A_2} \log i - \cdots - \sum_{i=1}^{A_n} \log i

The summations can be closely approximated by integrals:

H = \int_1^P \log x \, dx - \int_1^{A_1} \log x \, dx - \int_1^{A_2} \log x \, dx - \cdots - \int_1^{A_n} \log x \, dx.

The integral of the logarithm is

\int \log x \, dx = x \log x - \int x \, \frac{dx}{x} = x \log x - x.

So the entropy is

H = (P \log P - P + 1) - (A_1 \log A_1 - A_1 + 1) - (A_2 \log A_2 - A_2 + 1) - \cdots - (A_n \log A_n - A_n + 1)
= (P \log P + 1) - (A_1 \log A_1 + 1) - (A_2 \log A_2 + 1) - \cdots - (A_n \log A_n + 1)
= P \log P - \sum_{x=1}^{n} A_x \log A_x + (1 - n),

where the second step uses A_1 + A_2 + \cdots + A_n = P to cancel the linear terms.

Change A_x to p_x = A_x/P and change P to 1 (in order to measure the "bias" or "unevenness" in the probability distribution of the pockets for a single event); then

H = (1 - n) - \sum_{x=1}^{n} p_x \log p_x

and the term (1 − n) can be dropped since it is a constant, independent of the px distribution. The result is

H = -\sum_{x=1}^{n} p_x \log p_x .
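
The approximation behind this derivation can also be checked numerically: for a large number of events P, log Ω per event approaches the Shannon entropy of the pocket distribution. A sketch using natural logarithms and made-up counts:

    from math import lgamma, log

    def log_omega(counts):
        """Natural log of the multinomial coefficient P! / (A_1! ... A_n!)."""
        P = sum(counts)
        return lgamma(P + 1) - sum(lgamma(a + 1) for a in counts)

    def entropy_nats(counts):
        """-sum p_x ln p_x with p_x = A_x / P."""
        P = sum(counts)
        return -sum((a / P) * log(a / P) for a in counts if a > 0)

    counts = [4000, 3000, 2000, 1000]   # hypothetical tallies: n = 4 pockets, P = 10000 spins
    P = sum(counts)

    print(log_omega(counts) / P)        # ~1.2785 nats per event
    print(entropy_nats(counts))         # ~1.2799 nats; the two converge as P grows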

Thus, the Shannon entropy is a consequence of the equation

H = \log \Omega

which relates to Boltzmann's definition,

S = k \ln \Omega ,

of thermodynamic entropy.
