Canonical correlation

In statistics, canonical correlation analysis, introduced by Harold Hotelling, is a way of making sense of cross-covariance matrices.

Definition

Given two column vectors X = (x_1, \dots, x_n)' and Y = (y_1, \dots, y_m)' of random variables with finite second moments, one may define the cross-covariance \Sigma_{12} = \operatorname{cov}(X, Y) to be the n \times m matrix whose (i, j) entry is the covariance \operatorname{cov}(x_i, y_j).
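
As a concrete illustration of this definition, here is a minimal sketch of estimating a sample cross-covariance matrix with NumPy; the data and variable names are invented for illustration and are not part of the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 3, 2, 500                               # dim(X), dim(Y), sample size
X = rng.normal(size=(p, n))                       # rows are observations of X
Y = X[:, :m] + 0.5 * rng.normal(size=(p, m))      # Y partly driven by X

# Sample version of Sigma_12 = cov(X, Y): an n x m matrix of cov(x_i, y_j)
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
Sigma12 = Xc.T @ Yc / (p - 1)
print(Sigma12.shape)                              # (3, 2)
```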

Canonical correlation analysis seeks vectors a and b such that the random variables a'X and b'Y maximize the correlation \rho = \operatorname{cor}(a'X, b'Y). The random variables U = a'X and V = b'Y are the "first pair of canonical variables". Then one seeks vectors maximizing the same correlation subject to the constraint that they be uncorrelated with the first pair of canonical variables; this gives the "second pair of canonical variables". This procedure may be continued up to \min\{m, n\} times.
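
For readers who want to see the canonical pairs in practice, here is a hedged sketch using scikit-learn's CCA estimator (an assumed tool, not one the article mentions); it fits the pairs by an iterative algorithm, but the resulting canonical variables and correlations agree with the construction above up to sign and scaling.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                     # n = 3 variables
Y = X[:, :2] + 0.5 * rng.normal(size=(500, 2))    # m = 2 variables

cca = CCA(n_components=2)                         # at most min(m, n) = 2 pairs
U, V = cca.fit_transform(X, Y)                    # columns are canonical variables

# Correlation of the first pair (a'X, b'Y): the first canonical correlation
print(np.corrcoef(U[:, 0], V[:, 0])[0, 1])
```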

Computation

Proof

Let \Sigma_{11} = \operatorname{cov}(X, X) and \Sigma_{22} = \operatorname{cov}(Y, Y). The parameter to maximize is

:\rho = \frac{a' \Sigma_{12} b}{\sqrt{a' \Sigma_{11} a} \sqrt{b' \Sigma_{22} b}}.

The first step is to perform a change of basis, defining

:c = \Sigma_{11}^{1/2} a,

:d = \Sigma_{22}^{1/2} b.

And thus we have

:\rho = \frac{c' \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1/2} d}{\sqrt{c' c} \sqrt{d' d}}.

By the Cauchy-Schwarz inequality, we have

:c' \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1/2} d \leq \left(c' \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1/2} \Sigma_{22}^{-1/2} \Sigma_{21} \Sigma_{11}^{-1/2} c \right)^{1/2} \left(d' d \right)^{1/2},

:\rho \leq \frac{\left(c' \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1/2} \Sigma_{22}^{-1/2} \Sigma_{21} \Sigma_{11}^{-1/2} c \right)^{1/2}}{\left(c' c \right)^{1/2}}.

There is equality if the vectors d and \Sigma_{22}^{-1/2} \Sigma_{21} \Sigma_{11}^{-1/2} c are collinear. In addition, the maximum of the correlation is attained when c is the eigenvector with the maximum eigenvalue of the matrix \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21} \Sigma_{11}^{-1/2} (see Rayleigh quotient). The subsequent pairs are found by using eigenvalues of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.
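
Equivalently, this argument says the canonical correlations are the singular values of the whitened matrix \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1/2}, with c and d the corresponding singular vectors. The following is a minimal numerical sketch, with sample covariances standing in for the population matrices; the data and names are illustrative only.

```python
import numpy as np

def inv_sqrt(M):
    # inverse square root of a symmetric positive-definite matrix
    w, Q = np.linalg.eigh(M)
    return Q @ np.diag(w ** -0.5) @ Q.T

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 3))
Y = X[:, :2] + 0.5 * rng.normal(size=(1000, 2))

S = np.cov(np.hstack([X, Y]).T)                   # joint covariance in block form
S11, S12 = S[:3, :3], S[:3, 3:]
S21, S22 = S[3:, :3], S[3:, 3:]

# Whitened cross-covariance K = Sigma_11^{-1/2} Sigma_12 Sigma_22^{-1/2}
K = inv_sqrt(S11) @ S12 @ inv_sqrt(S22)
C, rho, Dt = np.linalg.svd(K)                     # rho: estimated canonical correlations
print(rho)                                        # columns of C are the vectors c
```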

Solution

The solution is therefore:
* c is an eigenvector of \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21} \Sigma_{11}^{-1/2}
* d is proportional to \Sigma_{22}^{-1/2} \Sigma_{21} \Sigma_{11}^{-1/2} c

Reciprocally, there is also:
* d is an eigenvector of \Sigma_{22}^{-1/2} \Sigma_{21} \Sigma_{11}^{-1} \Sigma_{12} \Sigma_{22}^{-1/2}
* c is proportional to \Sigma_{11}^{-1/2} \Sigma_{12} \Sigma_{22}^{-1/2} d

Reversing the change of coordinates, we have that:
* a is an eigenvector of \Sigma_{11}^{-1} \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21}
* b is an eigenvector of \Sigma_{22}^{-1} \Sigma_{21} \Sigma_{11}^{-1} \Sigma_{12}
* a is proportional to \Sigma_{11}^{-1} \Sigma_{12} b
* b is proportional to \Sigma_{22}^{-1} \Sigma_{21} a

The canonical variables are defined by:

:U = c' \Sigma_{11}^{-1/2} X = a' X

:V = d' \Sigma_{22}^{-1/2} Y = b' Y
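
The following hedged sketch implements this solution directly from sample covariance matrices, using the eigenvector characterization of a given above and the proportionality of b to \Sigma_{22}^{-1} \Sigma_{21} a; the data and variable names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 3))
Y = X[:, :2] + 0.5 * rng.normal(size=(1000, 2))

S = np.cov(np.hstack([X, Y]).T)
S11, S12 = S[:3, :3], S[:3, 3:]
S21, S22 = S[3:, :3], S[3:, 3:]

# a is an eigenvector of Sigma_11^{-1} Sigma_12 Sigma_22^{-1} Sigma_21;
# its eigenvalues are the squared canonical correlations
M = np.linalg.solve(S11, S12) @ np.linalg.solve(S22, S21)
eigvals, eigvecs = np.linalg.eig(M)
top = np.argmax(eigvals.real)
a = eigvecs[:, top].real

# b is proportional to Sigma_22^{-1} Sigma_21 a
b = np.linalg.solve(S22, S21 @ a)

U, V = X @ a, Y @ b                               # first pair of canonical variables
print(abs(np.corrcoef(U, V)[0, 1]), np.sqrt(eigvals.real[top]))  # these should agree
```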

Hypothesis testing

Each row can be tested for significance with the following method. Suppose we have p independent observations in a sample and \widehat{\rho}_i is the estimated correlation for i = 1, \dots, \min\{m, n\}. For the ith row (testing the hypothesis that the ith and all subsequent canonical correlations are zero), the test statistic is:

:\chi^2 = - \left( p - 1 - \frac{1}{2}(m + n + 1) \right) \ln \prod_{j = i}^{\min\{m, n\}} (1 - \widehat{\rho}_j^2),

which is asymptotically distributed as a chi-square with (m - i + 1)(n - i + 1) degrees of freedom for large p (Kanti V. Mardia, J. T. Kent and J. M. Bibby, Multivariate Analysis, Academic Press, 1979).
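
As a rough sketch of how this test might be carried out numerically (SciPy's chi-squared distribution is assumed; the sample size, dimensions, and estimated correlations below are invented for illustration):

```python
import numpy as np
from scipy.stats import chi2

p, m, n = 200, 4, 3                               # observations and dimensions
rho_hat = np.array([0.62, 0.31, 0.10])            # hypothetical estimated correlations

for i in range(1, min(m, n) + 1):
    # ln of the product = sum of logs over j = i, ..., min(m, n)
    stat = -(p - 1 - 0.5 * (m + n + 1)) * np.sum(np.log(1 - rho_hat[i - 1:] ** 2))
    df = (m - i + 1) * (n - i + 1)
    p_value = chi2.sf(stat, df)                   # upper-tail probability
    print(i, round(stat, 2), df, round(p_value, 4))
```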

Practical uses

A typical use for canonical correlation in the psychological context is to take two sets of variables and see what is common between the two sets. For example, one could take two well-established multidimensional personality tests such as the MMPI and the NEO. By seeing how the MMPI factors relate to the NEO factors, one could gain insight into what dimensions were common between the tests and how much variance was shared. For example, one might find that an extraversion or neuroticism dimension accounted for a substantial amount of shared variance between the two tests.

References and links

* See also generalized canonical correlation.
* "Applied Multivariate Statistical Analysis", Fifth Edition, Richard Johnson and Dean Wichern
* "Canonical correlation analysis - An overview with application to learning methods" http://eprints.ecs.soton.ac.uk/9225/01/tech_report03.pdf, pages 5-9 give a good introduction [http://homepage.mac.com/davidrh/_papers/NC_Hardoon_2817_reg.pdf Neural Computation (2004) version]
* [http://factominer.free.fr/ FactoMineR] (free exploratory multivariate data analysis software linked to R)

