Joint entropy
The joint entropy is an entropy measure used in information theory. It measures how much entropy is contained in a joint system of two random variables. If the random variables are $X$ and $Y$, the joint entropy is written $H(X,Y)$. Like other entropies, the joint entropy can be measured in bits, nats, or hartleys depending on the base of the logarithm.
Given a random variable $X$, the entropy $H(X)$ describes our uncertainty about the value of $X$. If $X$ consists of several events $x_i$, which each occur with probability $p_i$, then the entropy of $X$ is
$$H(X) = -\sum_i p_i \log_2 p_i.$$
Consider another random variable $Y$, containing events $y_j$ occurring with probabilities $q_j$. $Y$ has entropy $H(Y) = -\sum_j q_j \log_2 q_j$.
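The entropy formula above can be sketched in a few lines of Python (the function name `entropy` is our own choice, not part of any standard library):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum of p * log2(p) over all outcomes."""
    # Terms with p == 0 contribute nothing (lim p*log p = 0), so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair two-outcome variable (e.g. "is the integer even?") has 1 bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```

Using `math.log` instead of `math.log2` would give the entropy in nats rather than bits.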
However, if $X$ and $Y$ describe related events, the total entropy of the system may not be $H(X) + H(Y)$. For example, imagine we choose an integer between 1 and 8, with equal probability for each integer. Let $X$ represent whether the integer is even, and $Y$ represent whether the integer is prime. One-half of the integers between 1 and 8 are even, and one-half (2, 3, 5, and 7) are prime, so $H(X) = H(Y) = 1$ bit. However, if we know that the integer is even, there is only a 1 in 4 chance that it is also prime (only 2 is both even and prime); the distributions are related. The total entropy of the system is therefore less than 2 bits. We need a way of measuring the total entropy of both variables together.
We solve this by considering each pair of possible outcomes $(x_i, y_j)$. If each pair occurs with joint probability $p_{ij}$, the joint entropy is defined as
$$H(X,Y) = -\sum_i \sum_j p_{ij} \log_2 p_{ij}.$$
In the example above, 1 is not counted as a prime. The joint probability distribution then becomes: $P(\text{even}, \text{prime}) = 1/8$ (only the integer 2), $P(\text{even}, \text{not prime}) = 3/8$ (4, 6, 8), $P(\text{odd}, \text{prime}) = 3/8$ (3, 5, 7), and $P(\text{odd}, \text{not prime}) = 1/8$ (only 1).
Thus, the joint entropy is
$$H(X,Y) = -2\cdot\tfrac{1}{8}\log_2\tfrac{1}{8} - 2\cdot\tfrac{3}{8}\log_2\tfrac{3}{8} \approx 1.811 \text{ bits},$$
which is indeed less than the $H(X) + H(Y) = 2$ bits we would get if the two variables were independent.
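This calculation can be checked numerically; the sketch below applies the joint-entropy definition to the four joint probabilities of the even/prime example (the helper name `joint_entropy` is our own):

```python
import math

def joint_entropy(joint_probs):
    """Joint entropy in bits: -sum over all pairs of p_ij * log2(p_ij)."""
    return -sum(p * math.log2(p) for p in joint_probs if p > 0)

# Joint distribution for (even?, prime?) of an integer drawn uniformly from 1..8:
# (even, prime) = {2} -> 1/8, (even, not prime) = {4,6,8} -> 3/8,
# (odd, prime)  = {3,5,7} -> 3/8, (odd, not prime) = {1} -> 1/8.
p = [1/8, 3/8, 3/8, 1/8]
print(joint_entropy(p))  # approximately 1.811 bits, less than 2
```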
Greater than subsystem entropies
The joint entropy is always at least equal to the entropy of either original variable; adding a new variable can never reduce the uncertainty:
$$H(X,Y) \geq H(X), \qquad H(X,Y) \geq H(Y).$$
The first inequality is an equality if and only if $Y$ is a (deterministic) function of $X$. If $Y$ is a (deterministic) function of $X$, we also have
$$H(X) \geq H(Y).$$
Less than the sum of subsystem entropies
Two variables, considered together, can never have more entropy than the sum of the entropies of each of them:
$$H(X,Y) \leq H(X) + H(Y).$$
This is an example of subadditivity. The inequality is an equality if and only if $X$ and $Y$ are statistically independent. Like other entropies, $H(X,Y) \geq 0$ always.
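Both bounds can be verified on the even/prime example: the marginal distributions of $X$ and $Y$ are obtained by summing the joint table along rows and columns, and the joint entropy must land between the largest marginal entropy and their sum. A minimal sketch (the helper `H` is our own shorthand):

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint table for the even/prime example: rows = X (even?), cols = Y (prime?).
joint = [[1/8, 3/8],
         [3/8, 1/8]]

h_xy = H([p for row in joint for p in row])   # joint entropy H(X,Y)
h_x  = H([sum(row) for row in joint])         # marginal entropy H(X)
h_y  = H([sum(col) for col in zip(*joint)])   # marginal entropy H(Y)

# H(X,Y) is at least each marginal entropy and at most their sum.
assert max(h_x, h_y) <= h_xy <= h_x + h_y
print(h_x, h_y, h_xy)  # 1.0, 1.0, and roughly 1.811
```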
Relations to other entropy measures
The joint entropy is used in the definition of the conditional entropy,
$$H(X|Y) = H(X,Y) - H(Y),$$
and of the mutual information,
$$I(X;Y) = H(X) + H(Y) - H(X,Y).$$
In quantum information theory, the joint entropy is generalized into the joint quantum entropy.
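For the even/prime example, these two identities can be evaluated directly from the joint table (again using our own shorthand helper `H`):

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Even/prime example: rows = X (even?), cols = Y (prime?).
joint = [[1/8, 3/8],
         [3/8, 1/8]]
h_xy = H([p for row in joint for p in row])
h_x  = H([sum(row) for row in joint])
h_y  = H([sum(col) for col in zip(*joint)])

h_x_given_y = h_xy - h_y          # conditional entropy H(X|Y)
mutual_info = h_x + h_y - h_xy    # mutual information I(X;Y)
print(h_x_given_y, mutual_info)   # roughly 0.811 and 0.189 bits
```

The small but nonzero mutual information quantifies exactly how much the "even" and "prime" variables are related.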
Wikimedia Foundation. 2010.