- Skew-symmetric matrix

In linear algebra, a **skew-symmetric** (or **antisymmetric**) **matrix** is a square matrix *A* whose transpose is also its negative; that is, it satisfies the equation

:$A^{T} = -A$

or, in component form, if $A = (a_{ij})$:

:$a_{ij} = -a_{ji}$ for all *i* and *j*.

For example, the following matrix is skew-symmetric:

:$\begin{bmatrix} 0 & 2 & -1 \\ -2 & 0 & -4 \\ 1 & 4 & 0 \end{bmatrix}.$
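The defining condition $a_{ij} = -a_{ji}$ can be checked entry by entry; a minimal sketch in plain Python (illustrative only, no external libraries), using the example matrix above:

```python
def is_skew_symmetric(A):
    """Return True if the square matrix A (a list of rows) satisfies A^T = -A."""
    n = len(A)
    return all(A[i][j] == -A[j][i] for i in range(n) for j in range(n))

A = [[0,  2, -1],
     [-2, 0, -4],
     [1,  4,  0]]
print(is_skew_symmetric(A))  # True

# The identity matrix is symmetric but not skew-symmetric:
print(is_skew_symmetric([[1, 0], [0, 1]]))  # False
```

Note that the condition with `i == j` forces every diagonal entry to equal its own negative, i.e. to be zero, which is the trace property mentioned below.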

Compare this with a symmetric matrix, whose transpose is equal to the matrix itself:

:$A^{T} = A,$

or with an orthogonal matrix, whose transpose is equal to its inverse:

:$A^{T} = A^{-1}.$

**Properties**

Sums and scalar products of skew-symmetric matrices are again skew-symmetric. Hence, the skew-symmetric matrices form a vector space; its dimension is $\frac{n(n-1)}{2}$.

If the matrix *A* is skew-symmetric and *B* is an arbitrary matrix, then the triple product $B^{T}AB$ is skew-symmetric.

The *skew-symmetric component* of a square matrix *A* is the matrix $B = \frac{1}{2}\left(A - A^{T}\right)$; the *symmetric component* of *A* is $C = \frac{1}{2}\left(A + A^{T}\right)$. The matrix *A* is the sum of its symmetric and skew-symmetric components.
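The symmetric/skew-symmetric splitting is easy to compute directly; a small sketch in plain Python, using `fractions.Fraction` so that the halving stays exact for integer matrices:

```python
from fractions import Fraction

def transpose(A):
    return [list(row) for row in zip(*A)]

def split_symmetric_skew(A):
    """Return (C, B) with C symmetric, B skew-symmetric, and A = C + B."""
    n = len(A)
    At = transpose(A)
    C = [[Fraction(A[i][j] + At[i][j], 2) for j in range(n)] for i in range(n)]
    B = [[Fraction(A[i][j] - At[i][j], 2) for j in range(n)] for i in range(n)]
    return C, B

A = [[1, 7], [3, 4]]
C, B = split_symmetric_skew(A)
# C is the symmetric component [[1, 5], [5, 4]],
# B is the skew-symmetric component [[0, 2], [-2, 0]],
# and C + B recovers A entrywise.
```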

*A* is skew-symmetric if and only if $x^{T}Ax = 0$ for all real vectors *x*.

All main diagonal entries of a skew-symmetric matrix must be zero, and so the trace is zero.

**The determinant of a skew-symmetric matrix**

Let *A* be an *n*×*n* skew-symmetric matrix. The determinant of *A* satisfies

:$\det(A) = \det(A^{T}) = \det(-A) = (-1)^{n} \det(A).$

In particular, if *n* is odd the determinant vanishes. This result is called **Jacobi's theorem**, after Carl Gustav Jacobi (Eves, 1980).

The even-dimensional case is more interesting. It turns out that the determinant of *A* for even *n* can be written as the square of a polynomial in the entries of *A* (a theorem of Thomas Muir):

:$\det(A) = \operatorname{Pf}(A)^{2}.$

This polynomial is called the *Pfaffian* of *A* and is denoted Pf(*A*). Thus the determinant of a real skew-symmetric matrix is always non-negative.
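Both halves of this result can be checked numerically. The sketch below (plain Python, with a naive Laplace-expansion determinant that is fine for tiny matrices) verifies that the 3×3 example above has determinant zero, and that a sample 4×4 skew-symmetric matrix has determinant equal to the square of its Pfaffian, which in the 4×4 case is $a_{12}a_{34} - a_{13}a_{24} + a_{14}a_{23}$:

```python
def det(M):
    """Determinant via Laplace expansion along the first row (educational only)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

A = [[0,  2, -1],
     [-2, 0, -4],
     [1,  4,  0]]
print(det(A))  # 0 — Jacobi's theorem for odd n

B = [[0,  1,  2, 3],
     [-1, 0,  4, 5],
     [-2, -4, 0, 6],
     [-3, -5, -6, 0]]
pf = B[0][1] * B[2][3] - B[0][2] * B[1][3] + B[0][3] * B[1][2]  # 4x4 Pfaffian
print(det(B), pf ** 2)  # equal: det(B) = Pf(B)^2
```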

**Spectral theory**

The eigenvalues of a skew-symmetric matrix always come in pairs ±λ (except in the odd-dimensional case, where there is an additional unpaired zero eigenvalue). For a real skew-symmetric matrix the nonzero eigenvalues are all pure imaginary, and thus are of the form $i\lambda_{1}, -i\lambda_{1}, i\lambda_{2}, -i\lambda_{2}, \ldots$ where each $\lambda_{k}$ is real.

Real skew-symmetric matrices are normal matrices (they commute with their adjoints) and are thus subject to the spectral theorem, which states that any real skew-symmetric matrix can be diagonalized by a unitary matrix. Since the eigenvalues of a real skew-symmetric matrix are complex, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a block-diagonal form by an orthogonal transformation. Specifically, every 2*n* × 2*n* real skew-symmetric matrix can be written in the form $A = Q \Sigma Q^{T}$, where *Q* is orthogonal and

:$\Sigma = \begin{bmatrix}
\begin{matrix} 0 & \lambda_1 \\ -\lambda_1 & 0 \end{matrix} & 0 & \cdots & 0 \\
0 & \begin{matrix} 0 & \lambda_2 \\ -\lambda_2 & 0 \end{matrix} & & 0 \\
\vdots & & \ddots & \vdots \\
0 & 0 & \cdots & \begin{matrix} 0 & \lambda_r \\ -\lambda_r & 0 \end{matrix} \\
& & & & \begin{matrix} 0 \\ & \ddots \\ & & 0 \end{matrix}
\end{bmatrix}$

for real $\lambda_{k}$. The nonzero eigenvalues of this matrix are $\pm i\lambda_{k}$. In the odd-dimensional case Σ always has at least one row and column of zeros.
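The ±*i*λ pairing is already visible in a single 2×2 block of Σ: the block with entries λ and −λ off the diagonal has trace 0 and determinant λ², so its characteristic polynomial is $x^{2} + \lambda^{2}$, with roots $\pm i\lambda$. A minimal sketch via the quadratic formula:

```python
import cmath

lam = 3.0
# Block [[0, lam], [-lam, 0]]: trace = 0, determinant = lam^2,
# so the characteristic polynomial is x^2 + lam^2.
trace = 0.0
determinant = lam * lam
disc = cmath.sqrt(trace ** 2 - 4 * determinant)
eigs = ((trace + disc) / 2, (trace - disc) / 2)
print(eigs)  # a pure-imaginary conjugate pair, ±3j
```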

**Alternating forms**

An **alternating form** φ on a vector space *V* over a field *K* (of characteristic not 2) is defined to be a bilinear form

: φ : *V* × *V* → *K*

such that

: φ(*v*, *w*) = −φ(*w*, *v*).

Such a φ will be represented by a skew-symmetric matrix *A*, with $\varphi(v, w) = v^{T}Aw$, once a basis of *V* is chosen; and conversely, an *n*×*n* skew-symmetric matrix *A* on $K^{n}$ gives rise to an alternating form sending (*x*, *y*) to $x^{T}Ay$.
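A quick numerical check of this correspondence, assuming the standard basis of $\mathbb{R}^{3}$ (plain Python): the form $\varphi(v, w) = v^{T}Aw$ built from the skew-symmetric example matrix changes sign when its arguments are swapped, and vanishes when both arguments coincide.

```python
def phi(A, v, w):
    """Evaluate the bilinear form v^T A w."""
    n = len(A)
    return sum(v[i] * A[i][j] * w[j] for i in range(n) for j in range(n))

A = [[0,  2, -1],
     [-2, 0, -4],
     [1,  4,  0]]
v, w = [1, 2, 3], [4, 5, 6]
print(phi(A, v, w), phi(A, w, v))  # 12 and -12: antisymmetric
print(phi(A, v, v))                # 0: alternating
```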

**Infinitesimal rotations**

Skew-symmetric matrices form the tangent space to the orthogonal group O(*n*) at the identity matrix. In a sense, then, skew-symmetric matrices can be thought of as *infinitesimal rotations*.

Another way of saying this is that the space of skew-symmetric matrices forms the Lie algebra o(*n*) of the Lie group O(*n*). The Lie bracket on this space is given by the commutator:

:$[A, B] = AB - BA.$

It is easy to check that the commutator of two skew-symmetric matrices is again skew-symmetric.
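That check is a one-liner once matrix multiplication is in hand; a sketch in plain Python with an arbitrary pair of 3×3 skew-symmetric matrices:

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commutator(X, Y):
    """Lie bracket [X, Y] = XY - YX."""
    XY, YX = matmul(X, Y), matmul(Y, X)
    n = len(X)
    return [[XY[i][j] - YX[i][j] for j in range(n)] for i in range(n)]

C = [[0, 1, 2], [-1, 0, 3], [-2, -3, 0]]
D = [[0, 4, 5], [-4, 0, 6], [-5, -6, 0]]
K = commutator(C, D)
print(all(K[i][j] == -K[j][i] for i in range(3) for j in range(3)))  # True
```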

The matrix exponential of a skew-symmetric matrix *A* is an orthogonal matrix *R*:

:$R = \exp(A) = \sum_{n=0}^{\infty} \frac{A^{n}}{n!}.$

The image of the exponential map of a Lie algebra always lies in the connected component of the Lie group that contains the identity element. In the case of the Lie group O(*n*), this connected component is the special orthogonal group SO(*n*), consisting of all orthogonal matrices with determinant 1. So *R* = exp(*A*) has determinant +1. It turns out that *every* orthogonal matrix with unit determinant can be written as the exponential of some skew-symmetric matrix.
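For example, in two dimensions the exponential of the skew-symmetric matrix $\begin{bmatrix}0 & -\theta \\ \theta & 0\end{bmatrix}$ is the rotation through angle θ. The sketch below sums the power series directly in plain Python; this is a naive method for illustration only, and real code would use a library routine such as SciPy's `expm`:

```python
import math

def mat_exp(A, terms=30):
    """exp(A) by truncating the power series sum A^k / k! (naive, tiny matrices)."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity = A^0/0!
    term = [row[:] for row in result]
    for k in range(1, terms):
        # term becomes A^k / k! via term <- (term @ A) / k
        term = [[sum(term[i][m] * A[m][j] for m in range(n)) / k for j in range(n)]
                for i in range(n)]
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

theta = 0.5
A = [[0.0, -theta], [theta, 0.0]]
R = mat_exp(A)
# R approximates [[cos θ, -sin θ], [sin θ, cos θ]], an orthogonal matrix
print(round(R[0][0], 6), round(math.cos(theta), 6))
```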

**See also**

* Skew-Hermitian matrix
* Symplectic matrix
* Symmetry in mathematics

**References**

* Eves, Howard (1980). *Elementary Matrix Theory*. Dover Publications. ISBN 978-0-486-63946-8.

* Suprunenko, D. A. "Skew-symmetric matrix", *Encyclopedia of Mathematics*, Springer.

**External links**

* [Template for splitting a matrix online into a symmetric and a skew-symmetric addend](http://www.ocolon.org/editor/template.php?.matrix_split_symm_skew)
* [Antisymmetric matrix at Wolfram MathWorld](http://mathworld.wolfram.com/AntisymmetricMatrix.html)

