Cochran's theorem

In statistics, Cochran's theorem, devised by William G. Cochran,[1] is a theorem used to justify results relating to the probability distributions of statistics that are used in the analysis of variance.[2]

Statement

Suppose U1, ..., Un are independent standard normally distributed random variables, and an identity of the form


\sum_{i=1}^n U_i^2=Q_1+\cdots + Q_k

can be written, where each Qi is a sum of squares of linear combinations of the Us. Further suppose that


r_1+\cdots +r_k=n

where ri is the rank of Qi. Cochran's theorem states that the Qi are independent, and each Qi has a chi-square distribution with ri degrees of freedom.[citation needed]

Here the rank of Qi should be interpreted as the rank of the matrix B(i), with elements B_{j,k}^{(i)}, in the representation of Qi as a quadratic form:

Q_i=\sum_{j=1}^n\sum_{k=1}^n U_j B_{j,k}^{(i)} U_k .

Less formally, it is the number of linear combinations included in the sum of squares defining Qi, provided that these linear combinations are linearly independent.
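
As a concrete illustration (a minimal NumPy sketch, not part of the original statement; the matrix names B1 and B2 are illustrative), the decomposition of the total sum of squares into a deviations-from-the-mean part and a mean part corresponds to two projection matrices whose ranks sum to n:

import numpy as np

n = 5
J = np.ones((n, n)) / n        # averaging matrix, all entries 1/n
B1 = np.eye(n) - J             # quadratic form: sum of (U_i - Ubar)^2, rank n - 1
B2 = J                         # quadratic form: n * Ubar^2, rank 1

U = np.random.default_rng(0).standard_normal(n)
print(np.allclose(U @ (B1 + B2) @ U, U @ U))                        # True: the forms sum to the total sum of squares
print(np.linalg.matrix_rank(B1) + np.linalg.matrix_rank(B2) == n)   # True: the ranks sum to n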

Examples

Sample mean and sample variance

If X1, ..., Xn are independent normally distributed random variables with mean μ and standard deviation σ then

U_i = \frac{X_i-\mu}{\sigma}

is standard normal for each i. It is possible to write


\sum U_i^2=\sum\left(\frac{X_i-\overline{X}}{\sigma}\right)^2
+ n\left(\frac{\overline{X}-\mu}{\sigma}\right)^2

(here, summation is from 1 to n, that is, over the observations). To see this identity, multiply throughout by σ² and note that


\sum(X_i-\mu)^2=
\sum(X_i-\overline{X}+\overline{X}-\mu)^2

and expand to give


\sum(X_i-\mu)^2=
\sum(X_i-\overline{X})^2+\sum(\overline{X}-\mu)^2+
2\sum(X_i-\overline{X})(\overline{X}-\mu).

The third term is zero because it is equal to a constant times

\sum(\overline{X}-X_i)=0,

and the second term has just n identical terms added together. Thus


\sum(X_i-\mu)^2=
\sum(X_i-\overline{X})^2+n(\overline{X}-\mu)^2 ,

and hence


\sum\left(\frac{X_i-\mu}{\sigma}\right)^2=
\sum\left(\frac{X_i-\overline{X}}{\sigma}\right)^2
+n\left(\frac{\overline{X}-\mu}{\sigma}\right)^2
=Q_1+Q_2.
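
The decomposition is a purely algebraic identity, so it can be checked numerically for any sample. A small sketch (the values of μ, σ, n and the seed are arbitrary):

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 2.0, 3.0, 10
X = rng.normal(mu, sigma, n)
Xbar = X.mean()

lhs = np.sum((X - mu) ** 2)
rhs = np.sum((X - Xbar) ** 2) + n * (Xbar - mu) ** 2
print(np.allclose(lhs, rhs))    # True: the identity holds for any data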

Now the rank of Q2 is just 1 (it is the square of just one linear combination of the standard normal variables). The rank of Q1 can be shown to be n − 1, and thus the conditions for Cochran's theorem are met.

Cochran's theorem then states that Q1 and Q2 are independent, with chi-squared distributions with n − 1 and 1 degree of freedom respectively. This shows that the sample mean and sample variance are independent. This can also be shown by Basu's theorem, and in fact this property characterizes the normal distribution – for no other distribution are the sample mean and sample variance independent.[citation needed]
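
These conclusions can be checked by simulation. The following sketch (sample size, replication count and seed chosen arbitrarily) draws many replications, compares Q1 and Q2 against their claimed chi-squared distributions with a Kolmogorov–Smirnov test, and checks that their correlation is near zero, a necessary consequence of independence:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, reps = 8, 100_000
U = rng.standard_normal((reps, n))
Ubar = U.mean(axis=1)

Q1 = np.sum((U - Ubar[:, None]) ** 2, axis=1)   # claimed: chi-squared with n - 1 df
Q2 = n * Ubar ** 2                              # claimed: chi-squared with 1 df

print(stats.kstest(Q1, stats.chi2(n - 1).cdf).pvalue)   # large p-value expected
print(stats.kstest(Q2, stats.chi2(1).cdf).pvalue)       # large p-value expected
print(np.corrcoef(Q1, Q2)[0, 1])                        # near zero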

Distributions

The result for the distributions is written symbolically as


n(\overline{X}-\mu)^2\sim \sigma^2 \chi^2_1,

\sum\left(X_i-\overline{X}\right)^2  \sim \sigma^2 \chi^2_{n-1}.

Both these random variables are proportional to the true but unknown variance σ². Thus their ratio does not depend on σ² and, because they are statistically independent, the distribution of their ratio is given by


\frac{n\left(\overline{X}-\mu\right)^2}
{\frac{1}{n-1}\sum\left(X_i-\overline{X}\right)^2}\sim \frac{\chi^2_1}{\frac{1}{n-1}\chi^2_{n-1}}
   \sim F_{1,n-1}

where F1,n−1 is the F-distribution with 1 and n − 1 degrees of freedom (see also Student's t-distribution). The final step here is effectively the definition of a random variable having the F-distribution.
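
A simulation sketch of this distributional claim (parameter values are arbitrary; note that σ² cancels in the ratio, so the simulated ratios should match the F(1, n − 1) distribution):

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma, n, reps = 0.0, 2.0, 6, 100_000
X = rng.normal(mu, sigma, (reps, n))
Xbar = X.mean(axis=1)

num = n * (Xbar - mu) ** 2
den = np.sum((X - Xbar[:, None]) ** 2, axis=1) / (n - 1)
ratio = num / den

print(stats.kstest(ratio, stats.f(1, n - 1).cdf).pvalue)   # large p-value expected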

Estimation of variance

To estimate the variance σ², one estimator that is sometimes used is the maximum likelihood estimator of the variance of a normal distribution


\widehat{\sigma}^2=
\frac{1}{n}\sum\left(
X_i-\overline{X}\right)^2.

Cochran's theorem shows that


\frac{n\widehat{\sigma}^2}{\sigma^2}\sim\chi^2_{n-1}

and the properties of the chi-square distribution show that the expected value of \widehat{\sigma}^2 is σ²(n − 1)/n.
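
A quick Monte Carlo check of this expected value (parameter values are arbitrary; with σ = 2 and n = 5 the theoretical value is 4 × 4/5 = 3.2):

import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n, reps = 1.0, 2.0, 5, 200_000
X = rng.normal(mu, sigma, (reps, n))

sigma2_hat = ((X - X.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)
print(sigma2_hat.mean())            # empirical expectation, close to 3.2
print(sigma ** 2 * (n - 1) / n)     # theoretical value sigma^2 (n - 1)/n = 3.2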

Alternative formulation

The following version is often seen when considering linear regression.[citation needed] Suppose that Y ∼ N_n(0, σ²I_n) is a multivariate normal random vector (here I_n denotes the n-by-n identity matrix), and that A_1,\ldots,A_k are n-by-n symmetric matrices with \sum_{i=1}^k A_i=I_n. Then, on defining ri = Rank(Ai), any one of the following conditions implies the other two:

1. r_1+\cdots+r_k=n,
2. each quadratic form Y^{\mathrm{T}}A_iY is distributed as \sigma^2\chi^2_{r_i},
3. the quadratic forms Y^{\mathrm{T}}A_iY are mutually independent.
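
In the regression setting these hypotheses are satisfied, for example, by the hat matrix H and its complement I − H, which are symmetric, sum to the identity, and have ranks summing to n. A NumPy sketch (the design matrix Z is randomly generated for illustration):

import numpy as np

rng = np.random.default_rng(5)
n, p = 20, 3
Z = rng.standard_normal((n, p))        # hypothetical design matrix, full column rank

H = Z @ np.linalg.inv(Z.T @ Z) @ Z.T   # hat matrix: projection onto the column space of Z
A1, A2 = H, np.eye(n) - H

print(np.allclose(A1, A1.T) and np.allclose(A2, A2.T))              # True: symmetric
print(np.allclose(A1 + A2, np.eye(n)))                              # True: sum to I_n
print(np.linalg.matrix_rank(A1) + np.linalg.matrix_rank(A2) == n)   # True: ranks sum to n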

References

  1. Cochran, W. G. (April 1934). "The distribution of quadratic forms in a normal system, with applications to the analysis of covariance". Mathematical Proceedings of the Cambridge Philosophical Society 30 (2): 178–191. doi:10.1017/S0305004100016595.
  2. Bapat, R. B. (2000). Linear Algebra and Linear Models (Second ed.). Springer. ISBN 9780387988719.
