# Independent and identically-distributed random variables

*"IID" or "iid" redirects here. For other uses, see IID (disambiguation).*

In probability theory, a sequence or other collection of random variables is independent and identically distributed (i.i.d.) if each has the same probability distribution as the others and all are mutually independent.
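As a minimal illustration of the definition, an i.i.d. sequence can be simulated by drawing every value from one fixed distribution, with no draw depending on any other. The `iid_sample` helper below is a hypothetical name used only for this sketch, built on the Python standard library:

```python
import random

def iid_sample(n, draw, seed=0):
    """Return n i.i.d. values: every element comes from the same
    distribution ("identically distributed"), and each draw is made
    without reference to the others ("independent")."""
    rng = random.Random(seed)
    return [draw(rng) for _ in range(n)]

# Five i.i.d. standard-normal variates.
sample = iid_sample(5, lambda rng: rng.gauss(0.0, 1.0))
```

Because the draws share one distribution and one independent source of randomness, any statistic computed from `sample` treats all five positions symmetrically.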

The abbreviation "i.i.d." is particularly common in statistics (often as "iid", sometimes written "IID"), where observations in a sample are often assumed to be (more-or-less) i.i.d. for the purposes of statistical inference. The assumption (or requirement) that observations be i.i.d. tends to simplify the underlying mathematics of many statistical methods. However, in practical applications this may or may not be realistic.

This is important in the classical form of the central limit theorem, which states that the probability distribution of the sum (or average) of i.i.d. variables with finite variance approaches a normal distribution.
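The central limit theorem's prediction can be checked empirically. The sketch below (standard library only; sample sizes are arbitrary choices for illustration) averages many i.i.d. Uniform(0, 1) draws and compares the spread of the sample means with the theoretical value σ/√n, where σ = √(1/12) for the uniform distribution:

```python
import math
import random

rng = random.Random(42)
n = 100        # size of each i.i.d. sample
trials = 2000  # number of sample means to collect

# Each trial averages n i.i.d. Uniform(0, 1) variables.
means = [sum(rng.random() for _ in range(n)) / n for _ in range(trials)]

grand_mean = sum(means) / trials
spread = math.sqrt(sum((m - grand_mean) ** 2 for m in means) / trials)

# CLT prediction: the means are approximately Normal(0.5, sigma / sqrt(n)),
# where sigma = sqrt(1/12) is the standard deviation of Uniform(0, 1).
predicted = math.sqrt(1 / 12) / math.sqrt(n)
```

With these settings the empirical spread of the 2000 means lands very close to the predicted σ/√n ≈ 0.0289, and a histogram of `means` would look approximately normal.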

## Examples

The following are examples or applications of independent and identically distributed (i.i.d.) random variables:

*All other things being equal, a sequence of outcomes of spins of a roulette wheel is i.i.d. From a practical point of view, an important implication of this is that if the roulette ball lands on 'red', for example, 20 times in a row, the next spin is no more or less likely to be 'black' than on any other spin.

*All other things being equal, a sequence of dice rolls is i.i.d.

*All other things being equal, a sequence of coin flips is i.i.d.

*One of the simplest statistical tests, the "z"-test, is used to test hypotheses about means of random variables. When using the "z"-test, one assumes (requires) that all observations are i.i.d. in order to satisfy the conditions of the central limit theorem.

*In signal processing and image processing, transforming a signal to IID involves two requirements, one for the "ID" part and one for the "I" part: (ID) the signal level must be balanced along the time axis, and (I = independent) the signal spectrum must be flattened, i.e. transformed by filtering (such as deconvolution) into a white signal, one in which all frequencies are equally present.
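The one-sample "z"-test mentioned above can be sketched as follows. This is a minimal version for the case where the population standard deviation σ is known; the function name is illustrative rather than taken from any library:

```python
import math

def z_test(data, mu0, sigma):
    """Two-sided one-sample z-test of H0: mean == mu0,
    assuming the observations are i.i.d. with known sigma."""
    n = len(data)
    xbar = sum(data) / n
    # Standardize the sample mean under H0.
    z = (xbar - mu0) / (sigma / math.sqrt(n))
    # Standard normal CDF via the error function.
    phi = 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0)))
    p = 2.0 * (1.0 - phi)
    return z, p

# Example: four observations with mean 2.0, hypothesized mean 1.5, sigma 1.
z, p = z_test([2.0, 2.0, 2.0, 2.0], mu0=1.5, sigma=1.0)
```

Here z = (2.0 − 1.5)/(1/√4) = 1.0, giving a two-sided p-value of about 0.317; the i.i.d. assumption is what justifies treating the standardized mean as approximately standard normal.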

## See also

*De Finetti's theorem
*Chebyshev's theorem

Wikimedia Foundation. 2010.
