Symmetric matrix

In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Let A be a symmetric matrix. Then:

\( A = A^{\top}. \,\! \)

The entries of a symmetric matrix are symmetric with respect to the main diagonal (which runs from the top left to the bottom right). So if the entries are written as \( A = (a_{ij}) \), then

\( a_{ij} = a_{ji} \,\! \)

for all indices i and j. The following 3×3 matrix is symmetric:

\( \begin{bmatrix} 1 & 7 & 3\\ 7 & 4 & -5\\ 3 & -5 & 6\end{bmatrix}. \)
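
As a quick numerical check of the definition, here is a minimal sketch using NumPy (the library choice is this illustration's assumption, not part of the original article):

    import numpy as np

    # The 3x3 example matrix from above.
    A = np.array([[1, 7, 3],
                  [7, 4, -5],
                  [3, -5, 6]])

    # A symmetric matrix equals its transpose.
    assert (A == A.T).all()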

Every diagonal matrix is symmetric, since all off-diagonal entries are zero. Similarly, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.

In linear algebra, a real symmetric matrix represents a self-adjoint operator over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.

Properties

The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix A there exists a real orthogonal matrix Q such that \( D = Q^{\top} A Q \) is a diagonal matrix. Every symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.

Another way to phrase the spectral theorem is that a real n×n matrix A is symmetric if and only if there is an orthonormal basis of \( \mathbb{R}^n \) consisting of eigenvectors for A.

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the above diagonal matrix D, and therefore D is uniquely determined by A up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.
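
The theorem can be illustrated numerically. A sketch using NumPy's eigh routine, which is designed for symmetric/Hermitian input (the matrix is the example from above):

    import numpy as np

    A = np.array([[1., 7., 3.],
                  [7., 4., -5.],
                  [3., -5., 6.]])

    # eigh returns real eigenvalues and a matrix Q whose columns
    # are orthonormal eigenvectors of A.
    eigenvalues, Q = np.linalg.eigh(A)

    # Q is orthogonal: Q^T Q = I.
    assert np.allclose(Q.T @ Q, np.eye(3))

    # D = Q^T A Q is diagonal, with the eigenvalues on the diagonal.
    D = Q.T @ A @ Q
    assert np.allclose(D, np.diag(eigenvalues))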

A complex symmetric matrix A can often, but not always, be diagonalized in the form \( D = U^{\top} A U \), where D is complex diagonal and U is not Hermitian but complex orthogonal with \( U^{\top} U = I \). In this case the columns of U are the eigenvectors of A and the diagonal elements of D are the eigenvalues. An example of a complex symmetric matrix that cannot be diagonalized is

\( \begin{bmatrix} i& 1 \\ 1& -i\end{bmatrix}. \)
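
One way to see that this matrix is not diagonalizable: it squares to zero without being zero, so it is nilpotent, and the only diagonalizable nilpotent matrix is the zero matrix. A short NumPy check (a sketch, not from the original article):

    import numpy as np

    A = np.array([[1j, 1],
                  [1, -1j]])

    # A is complex symmetric ...
    assert np.allclose(A, A.T)

    # ... and nilpotent: A^2 = 0 although A != 0, so A cannot be
    # diagonalized (a diagonalizable nilpotent matrix must be zero).
    assert np.allclose(A @ A, np.zeros((2, 2)))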

The sum and difference of two symmetric matrices are again symmetric, but this is not always true for the product: given symmetric matrices A and B, the product AB is symmetric if and only if A and B commute, i.e., if AB = BA. So for any positive integer n, \( A^n \) is symmetric if A is symmetric. Two real symmetric matrices commute if and only if they have the same eigenspaces.
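
Both the failure for non-commuting factors and the symmetry of powers can be checked numerically (a minimal NumPy sketch; the particular matrices are this illustration's choice):

    import numpy as np

    A = np.array([[1, 2],
                  [2, 3]])
    B = np.array([[0, 1],
                  [1, 0]])

    # Both A and B are symmetric, but they do not commute ...
    assert not np.array_equal(A @ B, B @ A)
    # ... and accordingly their product is not symmetric.
    assert not np.array_equal(A @ B, (A @ B).T)

    # A commutes with itself, so every power A^n is symmetric.
    A3 = np.linalg.matrix_power(A, 3)
    assert np.array_equal(A3, A3.T)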

If \( A^{-1} \) exists, it is symmetric if and only if A is symmetric.

Let \( \mbox{Mat}_n \) denote the space of n × n matrices. A symmetric n × n matrix is determined by n(n + 1)/2 scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by n(n − 1)/2 scalars (the number of entries above the main diagonal). If \( \mbox{Sym}_n \) denotes the space of n × n symmetric matrices and \( \mbox{Skew}_n \) the space of n × n skew-symmetric matrices, then \( \mbox{Mat}_n = \mbox{Sym}_n + \mbox{Skew}_n \) and \( \mbox{Sym}_n \cap \mbox{Skew}_n = \{0\} \), i.e.

\( \mbox{Mat}_n = \mbox{Sym}_n \oplus \mbox{Skew}_n , \)

where ⊕ denotes the direct sum. Let \( X \in \mbox{Mat}_n \); then

\( X = \frac{1}{2}(X + X^{\top}) + \frac{1}{2}(X - X^{\top}) . \)

Notice that \( \frac{1}{2}(X + X^{\top}) \in \mbox{Sym}_n \) and \( \frac{1}{2}(X - X^{\top}) \in \mbox{Skew}_n \). This is true for every square matrix X with entries from any field whose characteristic is different from 2.
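
This decomposition is easy to verify numerically. A minimal NumPy sketch (the random test matrix is this illustration's choice):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((4, 4))   # an arbitrary square matrix

    S = (X + X.T) / 2   # symmetric part
    K = (X - X.T) / 2   # skew-symmetric part

    assert np.allclose(S, S.T)        # S is symmetric
    assert np.allclose(K, -K.T)       # K is skew-symmetric
    assert np.allclose(S + K, X)      # together they recover X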

Any matrix congruent to a symmetric matrix is again symmetric: if X is a symmetric matrix then so is \( A X A^{\top} \) for any matrix A.

Denote by \( \langle \cdot,\cdot \rangle \) the standard inner product on \( \mathbb{R}^n \). The real n-by-n matrix A is symmetric if and only if

\( \langle Ax,y \rangle = \langle x, Ay\rangle \quad \forall x,y\in\Bbb{R}^n. \)
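
A quick numerical confirmation of this identity (a NumPy sketch; the vectors and matrix are arbitrary choices for the illustration):

    import numpy as np

    rng = np.random.default_rng(1)
    A = np.array([[2., 1.],
                  [1., 3.]])           # a symmetric matrix
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)

    # <Ax, y> = <x, Ay> for the standard inner product.
    assert np.isclose((A @ x) @ y, x @ (A @ y))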

Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator A and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry: each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.

Every real symmetric matrix is a normal matrix; a complex symmetric matrix, however, need not be normal.
Decomposition

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.[1]

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.
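
SciPy exposes this factorization directly; a minimal sketch (scipy.linalg.polar and the random test matrix are this illustration's assumptions):

    import numpy as np
    from scipy.linalg import polar

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3))   # almost surely non-singular

    # A = U P with U orthogonal and P symmetric positive semidefinite
    # (positive definite when A is non-singular).
    U, P = polar(A)

    assert np.allclose(U.T @ U, np.eye(3))   # U is orthogonal
    assert np.allclose(P, P.T)               # P is symmetric
    assert np.allclose(U @ P, A)             # the factorization holds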

The Cholesky decomposition states that every real positive-definite symmetric matrix A is the product of a lower-triangular matrix L and its transpose, \( A = L L^{\top} \). If the matrix is symmetric indefinite, it may still be decomposed as \( P A P^{\top} = L D L^{\top} \), where P is a permutation matrix (arising from the need to pivot), L is a lower unit triangular matrix, and D is a direct sum of symmetric 1×1 and 2×2 blocks.[2]
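
A minimal NumPy sketch of the positive-definite case (the matrix is this illustration's choice):

    import numpy as np

    # A symmetric positive-definite matrix.
    A = np.array([[4., 2.],
                  [2., 3.]])

    # numpy's cholesky returns the lower-triangular factor L with A = L L^T.
    L = np.linalg.cholesky(A)

    assert np.allclose(L, np.tril(L))   # L is lower triangular
    assert np.allclose(L @ L.T, A)      # the factorization holds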

Every real symmetric matrix A can be diagonalized; moreover, the eigendecomposition takes a particularly simple form:

\( A = Q \Lambda Q^{\top} \)

where Q is an orthogonal matrix (the columns of which are eigenvectors of A), and Λ is real and diagonal (having the eigenvalues of A on the diagonal).
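
Read as a factorization, the same NumPy routine used above reconstructs A (a sketch continuing the earlier example):

    import numpy as np

    A = np.array([[1., 7., 3.],
                  [7., 4., -5.],
                  [3., -5., 6.]])

    eigenvalues, Q = np.linalg.eigh(A)
    Lam = np.diag(eigenvalues)

    # A = Q Lambda Q^T with Q orthogonal and Lambda real diagonal.
    assert np.allclose(Q @ Lam @ Q.T, A)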
Hessian

Symmetric real n-by-n matrices appear as the Hessians of twice continuously differentiable functions of n real variables.

Every quadratic form q on \( \mathbb{R}^n \) can be uniquely written in the form \( q(x) = x^{\top} A x \) with a symmetric n-by-n matrix A. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of \( \mathbb{R}^n \), "looks like"

\( q(x_1,\ldots,x_n)=\sum_{i=1}^n \lambda_i x_i^2 \)

with real numbers \( \lambda_i \). This considerably simplifies the study of quadratic forms, as well as the study of the level sets \( \{ x : q(x) = 1 \} \), which are generalizations of conic sections.
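
The change of basis is just \( y = Q^{\top} x \) with Q from the spectral theorem, which can be verified numerically (a NumPy sketch; the form's matrix is this illustration's choice):

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 2.]])          # symmetric matrix of the form q
    eigenvalues, Q = np.linalg.eigh(A)

    rng = np.random.default_rng(3)
    x = rng.standard_normal(2)

    # In the orthonormal eigenvector basis, q is a weighted sum of
    # squares: q(x) = x^T A x = sum_i lambda_i y_i^2, where y = Q^T x.
    y = Q.T @ x
    assert np.isclose(x @ A @ x, np.sum(eigenvalues * y**2))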

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.
Symmetrizable matrix

An n-by-n matrix A is said to be symmetrizable if there exist an invertible diagonal matrix D and a symmetric matrix S such that A = DS. The transpose of a symmetrizable matrix is symmetrizable, since \( (DS)^{\top} = SD = D^{-1}(DSD) \) and DSD is symmetric. A matrix \( A = (a_{ij}) \) is symmetrizable if and only if the following conditions are met:

\( a_{ij} = 0 \) implies \( a_{ji} = 0 \) for all \( 1 \le i \le j \le n \);
\( a_{i_1 i_2} a_{i_2 i_3} \cdots a_{i_k i_1} = a_{i_2 i_1} a_{i_3 i_2} \cdots a_{i_1 i_k} \) for any finite sequence \( (i_1, i_2, \dots, i_k) \).
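
As an illustration, the following NumPy sketch builds a symmetrizable matrix directly from the definition (the particular D and S are arbitrary choices for this illustration) and checks the first condition:

    import numpy as np

    D = np.diag([1., 2., 3.])              # invertible diagonal
    S = np.array([[0., 1., 0.],
                  [1., 2., 4.],
                  [0., 4., 5.]])           # symmetric

    A = D @ S                              # symmetrizable by construction
    assert not np.allclose(A, A.T)         # A itself need not be symmetric

    # First condition: a_ij = 0 exactly where a_ji = 0
    # (the zero pattern of A = DS is symmetric).
    assert np.array_equal(A == 0, (A == 0).T)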

See also

Other types of symmetry or pattern in square matrices have special names; see for example:

Antimetric matrix
Centrosymmetric matrix
Circulant matrix
Covariance matrix
Coxeter matrix
Hankel matrix
Hilbert matrix
Persymmetric matrix
Skew-symmetric matrix
Toeplitz matrix

See also symmetry in mathematics.
References

^ Bosch, A. J. (1986). "The factorization of a square matrix into two symmetric matrices". American Mathematical Monthly 93 (6): 462–464. doi:10.2307/2323471. JSTOR 2323471.
^ Golub, G. H.; Van Loan, C. F. (1996). Matrix Computations. Baltimore: Johns Hopkins University Press.
