The picture above is an application of singular value decomposition (SVD) in image processing. This post discusses a special case of SVD in which the matrix to be decomposed is symmetric. Symmetric matrices are important for understanding multivariate statistics: the variance-covariance matrix, which appears throughout multivariate statistics, is an example of a symmetric matrix. The first result that holds for symmetric matrices is given in the following theorem.

**Theorem 1**

Let A be a *k* × *k* symmetric matrix. Then A has *k* pairs of eigenvalues and eigenvectors, namely (λ_1, e_1), (λ_2, e_2), …, (λ_k, e_k). The eigenvectors can be chosen so that they have norm 1 and are mutually perpendicular. The eigenvectors are unique unless two or more eigenvalues are equal.

The spectral decomposition of symmetric matrices is performed by applying the following theorem.

**Theorem 2 [Spectral Decomposition of Symmetric Matrices]**

Let A be a *k* × *k* symmetric matrix, let λ_1, λ_2, …, λ_k be the eigenvalues of A, and let e_1, e_2, …, e_k be the associated normalized eigenvectors of A, which are mutually perpendicular. Then the matrix A can be expressed as:

A = λ_1 e_1 e_1′ + λ_2 e_2 e_2′ + ⋯ + λ_k e_k e_k′
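As a quick numerical sketch of Theorem 2, the sum λ_1 e_1 e_1′ + ⋯ + λ_k e_k e_k′ can be rebuilt with NumPy and compared against the original matrix. The 3 × 3 matrix below is a made-up example, not the one used later in this post.

```python
import numpy as np

# A symmetric 3x3 matrix (hypothetical example for illustration).
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])

# eigh is specialized for symmetric matrices: it returns real
# eigenvalues and orthonormal eigenvectors (the columns of E).
lam, E = np.linalg.eigh(A)

# Rebuild A as the sum of lambda_i * e_i e_i' from Theorem 2.
A_rebuilt = sum(lam[i] * np.outer(E[:, i], E[:, i]) for i in range(3))

print(np.allclose(A, A_rebuilt))  # True
```

Note that `eigh` already returns eigenvectors that are normalized and mutually perpendicular, exactly the choice Theorem 1 says is possible.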

**Are the eigenvectors obtained from symmetric matrices perpendicular to each other?** The theorem below answers this question.

**Theorem 3**

If A is a symmetric matrix, then eigenvectors from different eigenspaces are orthogonal.

Note that Theorem 3 does not say that all eigenvectors of a symmetric matrix are automatically perpendicular: eigenvectors taken from the same eigenspace (a repeated eigenvalue) need not be. However, eigenvectors that come from different eigenspaces are always perpendicular to each other.
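Theorem 3 can be checked numerically. The symmetric matrix below is a hypothetical example chosen to have a repeated eigenvalue (1, 1, 4), so it has a two-dimensional eigenspace for 1 and a one-dimensional eigenspace for 4.

```python
import numpy as np

# Hypothetical symmetric matrix with eigenvalues 1, 1, 4.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

lam, E = np.linalg.eigh(A)
print(lam)  # approximately [1. 1. 4.]

# An eigenvector for eigenvalue 1 and the eigenvector for
# eigenvalue 4 come from different eigenspaces, so by Theorem 3
# their dot product is zero.
print(np.isclose(E[:, 0] @ E[:, 2], 0.0))  # True
```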

**Are the eigenvectors obtained from symmetric matrices independent of each other?** The next theorem addresses this question.

**Theorem 4**

If e_1, e_2, …, e_r are eigenvectors of A corresponding to distinct eigenvalues λ_1, λ_2, …, λ_r, then e_1, e_2, …, e_r are linearly independent.

Referring to Theorem 4, the theorem by itself does not guarantee that the eigenvectors in Theorem 2 are independent when some eigenvalues are equal. However, eigenvectors obtained from distinct eigenvalues are always independent of each other.
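Theorem 4 can also be verified numerically: stacking eigenvectors from distinct eigenvalues as columns gives a full-rank matrix. The 2 × 2 matrix here is a hypothetical example with distinct eigenvalues 1 and 3.

```python
import numpy as np

# Hypothetical symmetric matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, E = np.linalg.eigh(A)

# The columns of E are eigenvectors for distinct eigenvalues;
# full rank means they are linearly independent.
print(np.linalg.matrix_rank(E))  # 2
```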

**Example**

Consider the matrix A as follows.

What is the spectral decomposition of A? Is the decomposition unique?

**Answer**

It can be shown that the characteristic equation of A is . For we get the eigenspace:

The basis of **E _{1}** with norm 1 is:

For we get the eigenspace:

In order to determine the spectral decomposition of A, we need eigenvectors that are perpendicular to each other and have norm 1. In this case, Theorem 3 ensures that the eigenvectors in **E _{1}** are perpendicular to the eigenvectors in **E _{2}**. Now we have to find the eigenvectors in **E _{2}** that are perpendicular to each other.

**E _{2}** is spanned by and as the basis vectors. To obtain an orthonormal basis for **E _{2}**, we can apply the Gram-Schmidt process. Normalization of produces one of its orthonormal basis vectors, namely . The projection of on the vector space spanned by is:

The component of perpendicular to the projection is:

Normalizing this vector results in . The vectors and span the eigenspace **E _{2}**, so is an orthonormal basis for **E _{2}**. So, the spectral decomposition of A is:

**The decomposition is not unique**. There are infinitely many possible bases for **E _{2}**. For instance, we can take two independent vectors in

**E _{2}** such as and . Note that is also a basis of **E _{2}**. With the Gram-Schmidt process, we can obtain another orthonormal basis. To demonstrate this, normalize to produce a basis vector with norm 1, i.e. . The projection of the vector on the vector space spanned by is:

The component of perpendicular to the projection is:

Normalizing this vector results in . This vector and span the eigenspace **E _{2}**, so is another orthonormal basis of **E _{2}**. So, the spectral decomposition for A is:
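The Gram-Schmidt step used above can be sketched in code. Since the original vectors of this example are not reproduced here, the two non-orthogonal vectors below are hypothetical stand-ins for a basis of a two-dimensional eigenspace.

```python
import numpy as np

# Hypothetical non-orthogonal basis of a 2-dimensional eigenspace.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])

# Normalize v1 to get the first orthonormal basis vector.
u1 = v1 / np.linalg.norm(v1)

# Subtract from v2 its projection on the span of u1 ...
w = v2 - (v2 @ u1) * u1

# ... and normalize the perpendicular component.
u2 = w / np.linalg.norm(w)

print(np.isclose(u1 @ u2, 0.0))  # True: u1, u2 are orthonormal
```

Running Gram-Schmidt starting from a different basis of the same eigenspace yields a different orthonormal basis, which is exactly why the spectral decomposition is not unique when an eigenvalue repeats.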