Spectral theorem

In mathematics, particularly linear algebra and functional analysis, the spectral theorem is any of a number of results about linear operators or about matrices. In broad terms the spectral theorem provides conditions under which an operator or a matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This concept of diagonalization is relatively straightforward for operators on finite-dimensional spaces, but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modelled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective.

Examples of operators to which the spectral theorem applies are self-adjoint operators or more generally normal operators on Hilbert spaces.

The spectral theorem also provides a canonical decomposition, called the spectral decomposition, eigenvalue decomposition, or eigendecomposition, of the underlying vector space on which the operator acts.

In this article we consider mainly the simplest kind of spectral theorem, that for a self-adjoint operator on a Hilbert space. However, as noted above, the spectral theorem also holds for normal operators on a Hilbert space.

Finite-dimensional case

Hermitian matrices

We begin by considering a Hermitian matrix "A" on a finite-dimensional real or complex inner product space "V" with the standard Hermitian inner product; the Hermitian condition means

: \langle Ax, y \rangle = \langle x, Ay \rangle

for all "x", "y" elements of "V".

An equivalent condition is that "A"* = "A", where "A"* is the conjugate transpose of "A". If "A" is a real matrix, this is equivalent to "A"^T = "A" (that is, "A" is a symmetric matrix). The eigenvalues of a Hermitian matrix are real.

Recall that an eigenvector of a linear operator "A" is a (non-zero) vector "x" such that "Ax" = λ"x" for some scalar λ. The value λ is the corresponding eigenvalue.

Theorem. There is an orthonormal basis of "V" consisting of eigenvectors of "A". Each eigenvalue is real.

We provide a sketch of a proof for the case where the underlying field of scalars is the complex numbers.

By the fundamental theorem of algebra, the characteristic polynomial of any square matrix with complex entries has a root, so every such matrix has at least one eigenvalue and hence an eigenvector. Now if "A" is Hermitian with eigenvector "e"1, we can consider the space "K" = span{"e"1}⊥, the orthogonal complement of "e"1. By Hermiticity, "K" is an invariant subspace of "A". Applying the same argument to the restriction of "A" to "K" shows that "A" has an eigenvector "e"2 ∈ "K". Finite induction then finishes the proof.

The spectral theorem also holds for symmetric matrices on finite-dimensional real inner product spaces, but the existence of an eigenvector is harder to establish, since the fundamental theorem of algebra is not directly available. However, viewed as a complex Hermitian matrix, a real symmetric matrix has real eigenvalues, and therefore it has eigenvectors with real entries.

If one chooses the eigenvectors of "A" as an orthonormal basis, the matrix representation of "A" in this basis is diagonal. Equivalently, "A" can be written as a linear combination of pairwise orthogonal projections, called its spectral decomposition. Let

: V_\lambda = \{\, v \in V : A v = \lambda v \,\}

be the eigenspace corresponding to an eigenvalue λ. Note that the definition does not depend on any choice of specific eigenvectors. "V" is the orthogonal direct sum of the spaces "V"λ, where the index ranges over the eigenvalues. Let "P"λ be the orthogonal projection onto "V"λ and let λ1, ..., λ"m" be the distinct eigenvalues of "A"; then the spectral decomposition can be written as:

: A = \lambda_1 P_{\lambda_1} + \cdots + \lambda_m P_{\lambda_m}.
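
As a concrete illustration of the decomposition above, the following short numpy sketch builds the rank-one orthogonal projections onto the eigenvectors of a small Hermitian matrix and verifies the identity numerically; the particular matrix is an arbitrary illustrative choice, and equal eigenvalues would in general be grouped into a single projection onto their common eigenspace.

```python
# Minimal sketch of the finite-dimensional spectral decomposition:
# an orthonormal eigenbasis of a Hermitian matrix A yields orthogonal
# projections P_k with A = lambda_1 P_1 + ... + lambda_m P_m.
import numpy as np

A = np.array([[2.0, 1.0j], [-1.0j, 3.0]])   # illustrative Hermitian matrix
assert np.allclose(A, A.conj().T)           # Hermitian condition A* = A

eigvals, U = np.linalg.eigh(A)              # columns of U: orthonormal eigenvectors

# One rank-one projection per eigenvector (repeated eigenvalues could be
# grouped into a single projection onto the corresponding eigenspace).
projections = [np.outer(U[:, k], U[:, k].conj()) for k in range(len(eigvals))]
A_rebuilt = sum(lam * P for lam, P in zip(eigvals, projections))

assert np.allclose(A, A_rebuilt)            # A = lambda_1 P_1 + ... + lambda_m P_m
assert np.allclose(eigvals.imag, 0.0)       # eigenvalues of a Hermitian matrix are real
```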

The spectral decomposition is a special case of the Schur decomposition. It is also a special case of the singular value decomposition.

Normal matrices

The spectral theorem extends to a more general class of matrices. Let "A" be an operator on a finite-dimensional inner product space. "A" is said to be normal if "A"* "A" = "A A"*. One can show that "A" is normal if and only if it is unitarily diagonalizable: by the Schur decomposition, we have "A" = "U T U"*, where "U" is unitary and "T" is upper triangular. Since "A" is normal, "T T"* = "T"* "T"; comparing the diagonal entries of these two products, starting with the first, forces the off-diagonal entries of "T" to vanish, so "T" must be diagonal. The converse is immediate.

In other words, "A" is normal if and only if there exists a unitary matrix "U" such that

: A = U \Lambda U^*

where Λ is the diagonal matrix whose entries are the eigenvalues of "A". The column vectors of "U" are the eigenvectors of "A", and they are orthonormal. Unlike in the Hermitian case, the entries of Λ need not be real.
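
A short numerical check of this statement is sketched below, using a real rotation matrix as an arbitrary example of a normal matrix that is not Hermitian; since its two eigenvalues are distinct, the eigenvectors returned by numpy.linalg.eig are automatically orthogonal, whereas for repeated eigenvalues a Schur decomposition would be the more robust route.

```python
# Sketch: a normal (here, unitary) matrix is unitarily diagonalizable,
# A = U Lambda U*, with possibly non-real eigenvalues.
import numpy as np

A = np.array([[0.0, -1.0], [1.0, 0.0]])        # 90-degree rotation; A A* = A* A
assert np.allclose(A @ A.conj().T, A.conj().T @ A)

eigvals, U = np.linalg.eig(A.astype(complex))  # eigenvalues are +i and -i
Lambda = np.diag(eigvals)

assert np.allclose(U.conj().T @ U, np.eye(2))  # U is unitary (orthonormal eigenvectors)
assert np.allclose(A, U @ Lambda @ U.conj().T) # A = U Lambda U*
```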

The spectral theorem for compact self-adjoint operators

In Hilbert spaces in general, the statement of the spectral theorem for compact self-adjoint operators is virtually the same as in the finite-dimensional case.

Theorem. Suppose "A" is a compact self-adjoint operator on a Hilbert space "V". There is an orthonormal basis of "V" consisting of eigenvectors of "A". Each eigenvalue is real.

As for Hermitian matrices, the key point is to prove the existence of at least one nonzero eigenvector. To prove this, we cannot rely on determinants to show existence of eigenvalues, but instead one can use a maximization argument analogous to the variational characterization of eigenvalues. The above spectral theorem holds for real or complex Hilbert spaces.
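
The maximization argument can be illustrated numerically: discretizing a compact self-adjoint integral operator, the maximum of the Rayleigh quotient ⟨"Av", "v"⟩/⟨"v", "v"⟩ over nonzero vectors is attained at an eigenvector and equals the largest eigenvalue. In the sketch below, the kernel min("s", "t") and the grid size are illustrative choices, not taken from the article.

```python
# Sketch of the variational/maximization argument on a discretized compact
# self-adjoint operator: the Rayleigh quotient <Kv, v>/<v, v> is maximized
# at an eigenvector, and its maximum is the largest eigenvalue.
import numpy as np

n = 200
s = (np.arange(n) + 0.5) / n        # midpoint grid on [0, 1]
h = 1.0 / n
K = np.minimum.outer(s, s) * h      # quadrature discretization of the kernel min(s, t)

# Power iteration: repeatedly apply K and renormalize; the Rayleigh quotient
# of the iterate converges to the largest eigenvalue.
v = np.ones(n)
for _ in range(500):
    v = K @ v
    v /= np.linalg.norm(v)

print(v @ K @ v)                    # maximum of the Rayleigh quotient
print(np.linalg.eigvalsh(K)[-1])    # largest eigenvalue; both approximate 4/pi^2 ≈ 0.405,
                                    # the top eigenvalue of the continuum operator
```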

If the compactness assumption is removed, it is no longer true that every self-adjoint operator has eigenvectors; the multiplication operator in the next section is an example.

The spectral theorem for bounded self-adjoint operators

The next generalization we consider is that of bounded self-adjoint operators "A" on a Hilbert space "V". Such operators may have no eigenvalues: for instance, let "A" be the operator of multiplication by "t" on "L"^2([0, 1]), that is,

: [A \varphi](t) = t \varphi(t).

Theorem. Let "A" be a bounded self-adjoint operator on a Hilbert space "H". Then there is a measure space ("X", Σ, μ) and a real-valued measurable function "f" on "X" and a unitary operator "U":"H" → "L"2μ("X") such that

: U^* T U = A

where "T" is the multiplication operator:

: [T \varphi](x) = f(x) \varphi(x).
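
In finite dimensions this multiplication-operator form reduces to ordinary diagonalization: one may take "X" to be a finite set of "n" points with counting measure, "f" the function listing the eigenvalues, and "U" the map sending a vector to its coordinates in an orthonormal eigenbasis. The following sketch checks this for an arbitrary small symmetric matrix.

```python
# Finite-dimensional illustration of the multiplication-operator form:
# X = {0, ..., n-1} with counting measure, f(k) = lambda_k, and U maps a
# vector to its coordinates in an orthonormal eigenbasis of A, so U* T U = A.
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])     # illustrative self-adjoint (symmetric) matrix

f, V = np.linalg.eigh(A)                   # eigenvalues f and orthonormal eigenbasis V
U = V.conj().T                             # U : H -> L^2(X), coordinates in the eigenbasis
T = np.diag(f)                             # multiplication by the real-valued function f

assert np.allclose(U.conj().T @ T @ U, A)  # U* T U = A
```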

This is the beginning of the vast research area of functional analysis called operator theory.

There is also an analogous spectral theorem for bounded normal operators on Hilbert spaces. The only difference in the conclusion is that now f may be complex-valued.

An alternative formulation of the spectral theorem expresses the operator A as an integral of the coordinate function over the operator's spectrum with respect to a projection-valued measure. When the normal operator in question is compact, this version of the spectral theorem reduces to the finite-dimensional spectral theorem above, except that the operator is expressed as a linear combination of possibly infinitely many projections.
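
In this formulation, if "E" denotes the projection-valued measure (the resolution of the identity) associated with "A", the theorem reads

: A = \int_{\sigma(A)} \lambda \, dE(\lambda),

where σ("A") denotes the spectrum of "A".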

The spectral theorem for general self-adjoint operators

Many important linear operators which occur in analysis, such as differential operators, are unbounded. There is also a spectral theorem for self-adjoint operators that applies in these cases. To give an example, every constant-coefficient differential operator is unitarily equivalent to a multiplication operator. Indeed, the unitary operator that implements this equivalence is the Fourier transform; the multiplication operator is a type of Fourier multiplier.
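
For instance, with the unitary normalization \hat{\varphi}(\xi) = (2\pi)^{-1/2} \int_{\mathbb{R}} \varphi(x) e^{-i x \xi} \, dx of the Fourier transform \mathcal{F} (a standard convention, chosen here only for illustration), the one-dimensional Laplacian satisfies

: \big[\mathcal{F} \, (-\tfrac{d^2}{dx^2}) \, \mathcal{F}^{-1} \varphi\big](\xi) = \xi^2 \varphi(\xi),

so −d²/dx² on "L"^2(ℝ) is unitarily equivalent to multiplication by the real-valued function ξ².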

See also

* Matrix decomposition
* Canonical form
* Jordan decomposition, of which the spectral decomposition is a special case.
* Singular value decomposition, a generalisation of the spectral theorem to arbitrary matrices.
* Eigendecomposition of a matrix
