Joint entropy

The joint entropy is an entropy measure used in information theory. It measures how much uncertainty is contained in a joint system of two random variables. If the random variables are X and Y, the joint entropy is written H(X,Y). Like other entropies, the joint entropy can be measured in bits, nats, or hartleys, depending on the base of the logarithm.

Background

Given a random variable X, the entropy H(X) describes our uncertainty about the value of X. If X can take several values x, each occurring with probability p_x, then the entropy of X is

H(X) = -\sum_x p_x \log_2(p_x)
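
As a concrete illustration, here is a minimal Python sketch of this formula; the helper name entropy is chosen purely for illustration and is not part of any standard library.

    import math

    def entropy(probs):
        """Shannon entropy, in bits, of a list of probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy([1/8] * 8))    # uniform choice among 8 values: 3.0 bits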

Consider another random variable Y, containing events y occurring with probabilities p_y. Y has entropy H(Y).

However, if X and Y describe related events, the total entropy of the system may not be H(X)+H(Y). For example, imagine we choose an integer between 1 and 8, with equal probability for each integer. Let X represent whether the integer is even, and Y represent whether the integer is prime. One-half of the integers between 1 and 8 are even, and one-half are prime, so H(X)=H(Y)=1. However, if we know that the integer is even, there is only a 1 in 4 chance that it is also prime; the distributions are related. The total entropy of the system is less than 2 bits. We need a way of measuring the total entropy of both systems.

Definition

We solve this by considering each possible pair of outcomes (x,y). If the pair (x,y) occurs with probability p_{x,y}, the joint entropy is defined as

H(X,Y) = -\sum_{x,y} p_{x,y} \log_2(p_{x,y})

In the example above, 1 is not counted as a prime. The joint probability distribution is then:

P(\text{even}, \text{prime}) = P(\text{odd}, \text{not prime}) = \frac{1}{8}

P(\text{even}, \text{not prime}) = P(\text{odd}, \text{prime}) = \frac{3}{8}

Thus, the joint entropy is

-2 \cdot \frac{1}{8} \log_2\left(\frac{1}{8}\right) - 2 \cdot \frac{3}{8} \log_2\left(\frac{3}{8}\right) \approx 1.8 \text{ bits}.
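
This figure can be checked numerically. The sketch below is only an illustration: the helper name joint_entropy and the dictionary layout are chosen here for readability, not taken from any library.

    import math

    def joint_entropy(joint_probs):
        """Joint entropy, in bits, from a dict mapping outcome pairs to probabilities."""
        return -sum(p * math.log2(p) for p in joint_probs.values() if p > 0)

    # Joint distribution of (parity, primality) for a uniform integer from 1 to 8,
    # with 1 not counted as a prime.
    p_xy = {
        ("even", "prime"):     1/8,
        ("odd",  "not prime"): 1/8,
        ("even", "not prime"): 3/8,
        ("odd",  "prime"):     3/8,
    }
    print(joint_entropy(p_xy))   # ~1.811 bits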

Properties

Greater than subsystem entropies

The joint entropy is always at least as large as the entropy of either of the original systems; adding a new system can never reduce the available uncertainty.

H(X,Y) \geq H(X)

This inequality is an equality if and only if Y is a (deterministic) function of X.

If Y is a (deterministic) function of X, we also have

H(X) \geq H(Y)
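
A quick numerical illustration of the equality case, again using the ad-hoc entropy helper from above rather than a library routine: let X be a uniform integer from 1 to 8 and Y = f(X) its parity, so that Y is a deterministic function of X.

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # The pair (x, f(x)) takes exactly 8 equally likely values, so the joint
    # distribution has the same probabilities as X itself.
    H_X  = entropy([1/8] * 8)       # 3.0 bits
    H_Y  = entropy([1/2, 1/2])      # 1.0 bit
    H_XY = entropy([1/8] * 8)       # 3.0 bits, equal to H(X)
    print(H_XY == H_X, H_X >= H_Y)  # True True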

Subadditivity

Two systems, considered together, can never have more entropy than the sum of their individual entropies. This is an example of subadditivity.

H(X,Y) \leq H(X) + H(Y)

This inequality is an equality if and only if X and Y are statistically independent.
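For the even/prime example above, subadditivity can be verified directly. The following is a small illustrative sketch rather than a general-purpose routine.

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Marginal and joint distributions from the even/prime example.
    H_X  = entropy([1/2, 1/2])            # parity: 1 bit
    H_Y  = entropy([1/2, 1/2])            # primality: 1 bit
    H_XY = entropy([1/8, 1/8, 3/8, 3/8])  # joint: ~1.811 bits
    print(H_XY <= H_X + H_Y)              # True: 1.811 <= 2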

Bounds

Like other entropies, the joint entropy is never negative:

H(X,Y) \geq 0

Relations to other entropy measures

The joint entropy is used in the definitions of the conditional entropy:

H(X|Y) = H(X,Y) - H(Y),

and the mutual information:

I(X;Y) = H(X) + H(Y) - H(X,Y).
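
Using the even/prime example once more, both quantities can be computed from the joint entropy; the sketch below reuses the same ad-hoc entropy helper as before.

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Marginal and joint entropies from the even/prime example.
    H_X  = entropy([1/2, 1/2])
    H_Y  = entropy([1/2, 1/2])
    H_XY = entropy([1/8, 1/8, 3/8, 3/8])

    H_X_given_Y = H_XY - H_Y        # conditional entropy H(X|Y), ~0.811 bits
    I_XY        = H_X + H_Y - H_XY  # mutual information I(X;Y), ~0.189 bits
    print(H_X_given_Y, I_XY)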

In quantum information theory, the joint entropy is generalized into the joint quantum entropy.
