# Spectral (Eigenvalue) decomposition (EVD)

A square matrix $A_{n\times n}$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors. Therefore, $A = PDP^{-1}$, where the columns of $P$ are the eigenvectors of $A$ and $D$ is a diagonal matrix of the corresponding (to the eigenvectors) eigenvalues $\lambda_i$'s. Consequently, the square matrix $A$ can be written as the following sum:

$$A = PDP^{-1} = \sum_{i=1}^{n} \lambda_i \mathbf{p}_i \mathbf{q}_i^T$$

where $(\lambda_i, \mathbf{p}_i)$ is the $i$-th eigenpair of $A$ and $\mathbf{q}_i^T$ is the $i$-th row of $P^{-1}$. Here, we only consider cases with real eigenvalues.

If $A$ is a symmetric matrix, always having real eigenvalues and being diagonalizable, we can write $A = PDP^T$, where $P$ contains columns of orthonormal eigenvectors of $A$. Therefore,

$$A = PDP^T = \sum_{i=1}^{n} \lambda_i \mathbf{p}_i \mathbf{p}_i^T$$

This representation is called the **spectral decomposition** or **eigenvalue decomposition (EVD)** of $A$. As can be observed, the decomposition expresses $A$ as a linear combination of rank-1 matrices $\mathbf{p}_i\mathbf{p}_i^T$, in which the coefficients of the combination are the eigenvalues of $A$. Note that the decomposition is not unique.
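As a numerical illustration (a minimal sketch using NumPy; the symmetric matrix below is an arbitrary example), the spectral decomposition can be computed with `np.linalg.eigh` and $A$ reassembled from its rank-1 terms:

```python
import numpy as np

# An arbitrary symmetric example matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns real eigenvalues and orthonormal eigenvectors for symmetric input.
eigvals, P = np.linalg.eigh(A)

# Reconstruct A as the sum of rank-1 matrices lambda_i * p_i p_i^T.
A_rebuilt = sum(lam * np.outer(P[:, i], P[:, i])
                for i, lam in enumerate(eigvals))

print(np.allclose(A, A_rebuilt))          # True: A = P D P^T
print(np.allclose(P @ P.T, np.eye(3)))    # True: P is orthogonal
```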

# Singular value decomposition (SVD)

**Theorem [SVD1]:** If $A_{m\times n}$ is of rank $k$, then $A$ can be factored as

$$A_{m\times n} = U_{m\times m}\,\Sigma_{m\times n}\,V_{n\times n}^T$$

where $U$ and $V$ are orthogonal matrices and $\Sigma$ is an $m\times n$ diagonal matrix whose main diagonal has $k$ positive entries and $\min(m,n)-k$ zeros.

Proof:

We can prove that the matrix $A^TA$ is symmetric. So, it has an EVD as

$$A^TA = VDV^T$$

where the columns of $V$ contain orthonormal eigenvectors of $A^TA$, and $D$ contains the eigenvalues of $A^TA$, which are all non-negative. Because if $(\lambda, \mathbf{v})$ is an eigenpair of $A^TA$,

$$0 \le \|A\mathbf{v}\|^2 = \mathbf{v}^T A^T A \mathbf{v} = \lambda\, \mathbf{v}^T \mathbf{v} = \lambda \|\mathbf{v}\|^2 \implies \lambda \ge 0$$

Therefore, the entries of $D$, being eigenvalues of $A^TA$, are all non-negative. Note that $\mathbf{v}$ cannot be the zero vector as it is an eigenvector of $A^TA$, but $A\mathbf{v}$ can be the zero vector; therefore we used $\ge$ rather than $>$.

As by a rank theorem $\operatorname{rank}(A^TA) = \operatorname{rank}(A)$, and by the assumption $\operatorname{rank}(A) = k$, there are $k$ positive entries and $n-k$ zeros on the main diagonal of $D$.
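This rank identity can be checked numerically; a minimal sketch using NumPy, with an arbitrary rank-2 example matrix:

```python
import numpy as np

# A 4x3 matrix of rank 2: the third column is the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 3.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.matrix_rank(A))        # 2
print(np.linalg.matrix_rank(A.T @ A))  # 2: rank(A^T A) = rank(A)
```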

Let’s consider the set of eigenpairs of $A^TA$ as

$$\{(\lambda_1, \mathbf{v}_1), (\lambda_2, \mathbf{v}_2), \dots, (\lambda_n, \mathbf{v}_n)\}, \qquad \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_n \ge 0$$

where the $\mathbf{v}_i$'s are orthonormal (they exist as $A^TA$ is symmetric), i.e. $\mathbf{v}_i^T\mathbf{v}_j = 0$ for $i \ne j$ and $\|\mathbf{v}_i\| = 1$.

Considering the set of image vectors $\{A\mathbf{v}_1, \dots, A\mathbf{v}_n\}$, we can write, for $i \ne j$, $(A\mathbf{v}_i)^T(A\mathbf{v}_j) = \mathbf{v}_i^T A^TA \mathbf{v}_j = \lambda_j\, \mathbf{v}_i^T \mathbf{v}_j = 0$. Therefore, the $A\mathbf{v}_i$'s are orthogonal.

Also, $\|A\mathbf{v}_i\|^2 = \lambda_i$. Therefore, $A\mathbf{v}_i \ne \mathbf{0}$ for $i = 1, \dots, k$.

Consequently, for $i = 1, \dots, k$, the set $\{A\mathbf{v}_1, \dots, A\mathbf{v}_k\}$ is an orthogonal set of non-zero vectors in the column space of $A$ (because for any vector $\mathbf{x}$ and matrix $A$, the term $A\mathbf{x}$ is $x_1\mathbf{a}_1 + \dots + x_n\mathbf{a}_n$, i.e. a linear combination of the columns $\mathbf{a}_i$ of $A$). Because $\operatorname{rank}(A) = k$, the column space of $A$ has dimension $k$, and hence $\{A\mathbf{v}_1, \dots, A\mathbf{v}_k\}$ is an orthogonal basis for the column space of $A$. Normalizing these vectors as $\mathbf{u}_i = \frac{A\mathbf{v}_i}{\|A\mathbf{v}_i\|} = \frac{A\mathbf{v}_i}{\sqrt{\lambda_i}}$, $i = 1, \dots, k$, we can find an extended set of orthonormal basis vectors for $\mathbb{R}^m$ as,

$$\{\mathbf{u}_1, \dots, \mathbf{u}_k, \mathbf{u}_{k+1}, \dots, \mathbf{u}_m\}$$

Note that $A\mathbf{v}_i = \mathbf{0}$ if $i > k$, because $\|A\mathbf{v}_i\|^2 = \lambda_i$ and $\lambda_i = 0$ for those $i$'s. Therefore, we needed to extend $\{\mathbf{u}_1, \dots, \mathbf{u}_k\}$ to a set of $m$ orthonormal vectors spanning $\mathbb{R}^m$.

Now defining a matrix $U_{m\times m}$ as ($\mathbf{u}_i$ is in a column vector form)

$$U = [\mathbf{u}_1 \ \mathbf{u}_2 \ \cdots \ \mathbf{u}_m]$$

and a diagonal matrix $\Sigma_{m\times n}$ with main-diagonal entries

$$\sigma_i = \sqrt{\lambda_i}, \qquad i = 1, \dots, \min(m, n)$$

where the diagonal entries are zero for $i > k$, we can write

$$U\Sigma = [\sigma_1\mathbf{u}_1 \ \cdots \ \sigma_k\mathbf{u}_k \ \mathbf{0} \ \cdots \ \mathbf{0}] = [A\mathbf{v}_1 \ \cdots \ A\mathbf{v}_n] = AV$$

where the columns of $V$ collect the $\mathbf{v}_i$'s. So, since $V$ is orthogonal, then

$$A = U\Sigma V^T$$

Note that the entries on the main diagonal of $\Sigma$ are NOT the eigenvalues of $A$ but rather the square roots of the (non-negative) eigenvalues of $A^TA$. These values are called the **singular values** of $A$, though. Also, the $\mathbf{u}_i$'s and the $\mathbf{v}_i$'s are respectively called the left and the right **singular vectors** of $A$.
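The construction in the proof can be replayed numerically. Below is a sketch using NumPy, assuming a full-column-rank example (so no basis extension is needed); the $3\times 2$ matrix is arbitrary:

```python
import numpy as np

# An arbitrary 3x2 example with full column rank (k = n = 2).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# EVD of A^T A; eigh returns eigenvalues in ascending order, so flip
# to get lambda_1 >= lambda_2 >= 0.
eigvals, V = np.linalg.eigh(A.T @ A)
eigvals, V = eigvals[::-1], V[:, ::-1]

sigma = np.sqrt(eigvals)        # singular values sigma_i = sqrt(lambda_i)
U = (A @ V) / sigma             # u_i = A v_i / sigma_i, columns orthonormal
Sigma = np.diag(sigma)          # reduced 2x2 Sigma (the full Sigma pads a zero row)

print(np.allclose(A, U @ Sigma @ V.T))   # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(2)))   # True: the u_i are orthonormal
```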

**Remark:** The singular values of a square matrix $A$ are the square roots of the eigenvalues of $A^TA$. The right singular vectors of $A$ are the eigenvectors of $A^TA$, and the left singular vectors of $A$ are the eigenvectors of $AA^T$.

**Theorem [singular value decomposition of square matrices]:** A square matrix $A_{n\times n}$ of rank $k$ has a singular value decomposition as

$$A = U\Sigma V^T = \sum_{i=1}^{k} \sigma_i \mathbf{u}_i \mathbf{v}_i^T$$

where $\mathbf{v}_1, \dots, \mathbf{v}_n$ are orthonormal eigenvectors of $A^TA$, $\mathbf{v}_i$ associated with the eigenpair $(\lambda_i, \mathbf{v}_i)$. The column vectors of $V$ are ordered according to $\lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_n \ge 0$. Also, $\sigma_i = \sqrt{\lambda_i}$ and $\mathbf{u}_i = \frac{A\mathbf{v}_i}{\sigma_i}$ for $i = 1, \dots, k$, and $\{\mathbf{u}_{k+1}, \dots, \mathbf{u}_n\}$ is such that it extends $\{\mathbf{u}_1, \dots, \mathbf{u}_k\}$ to an orthonormal basis for $\mathbb{R}^n$.

**Remark:** $U$ and $V$ are not unique. If we say *THE SVD* of $A$, we refer to a specific decomposition, not to the uniqueness of the decomposition.

### SVD of invertible matrices

If $A_{n\times n}$ is invertible, then $\operatorname{rank}(A) = n$ and hence all the elements on the diagonal of $\Sigma$ are non-zero, and there is no need to extend the $\mathbf{u}_i$'s, as there are already $n$ independent orthonormal vectors.
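For an invertible matrix, the SVD also gives a convenient expression for the inverse, $A^{-1} = V\Sigma^{-1}U^T$, since all $\sigma_i > 0$. A small sketch with an arbitrary invertible example:

```python
import numpy as np

# An arbitrary invertible example (det = 5).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(A)
print(np.all(s > 0))                          # True: no zero singular values

A_inv = Vt.T @ np.diag(1.0 / s) @ U.T         # A^{-1} = V Sigma^{-1} U^T
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```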

### SVD of symmetric matrices

If $A$ is symmetric, then $A^TA = A^2$ and hence $\operatorname{rank}(A^TA) = \operatorname{rank}(A)$. Therefore, if $\operatorname{rank}(A) = k$, then $A$ has $k$ non-zero eigenvalues. If $(\lambda, \mathbf{v})$ is an eigenpair of $A$, then $(\lambda^2, \mathbf{v})$ is an eigenpair of $A^TA$, because,

$$A^TA\mathbf{v} = A(A\mathbf{v}) = A(\lambda\mathbf{v}) = \lambda A\mathbf{v} = \lambda^2\mathbf{v}$$

Hence, in the SVD of the symmetric matrix $A$:

1- the singular values of $A$ are $\sigma_i = \sqrt{\lambda_i^2} = |\lambda_i|$.

2- the left singular vectors of $A$ are $\mathbf{u}_i = \operatorname{sign}(\lambda_i)\,\mathbf{v}_i$, with the $\mathbf{v}_i$'s being the normalized eigenvectors of $A$.

3- the right singular vectors of $A$ are the $\mathbf{v}_i$'s, as $\mathbf{u}_i = \frac{A\mathbf{v}_i}{\sigma_i} = \frac{\lambda_i}{|\lambda_i|}\mathbf{v}_i$ for $i = 1, \dots, k$.

Consequently, the SVD and the EVD of a symmetric matrix are the same if the eigenvalues of the matrix are non-negative; otherwise, the decompositions agree up to the sign of a left (or a right) singular vector.
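A quick numerical check of the relation between eigenvalues and singular values of a symmetric matrix (a sketch with an arbitrary example having one negative eigenvalue):

```python
import numpy as np

# Symmetric example with eigenvalues 3 and -1.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

eigvals = np.linalg.eigvalsh(A)               # eigenvalues of A
s = np.linalg.svd(A, compute_uv=False)        # singular values of A

# Singular values are the absolute eigenvalues, sorted in descending order.
print(np.allclose(np.sort(np.abs(eigvals))[::-1], s))   # True
```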

### Reduced SVD

The zero rows and columns of $\Sigma$ in an SVD are redundant and can be removed; therefore we can write the following **reduced SVD,**

$$A = U_{m\times k}\,\Sigma_{k\times k}\,(V_{n\times k})^T$$

Also, the corresponding **reduced SVD expansion** of $A$ is,

$$A = \sum_{i=1}^{k} \sigma_i \mathbf{u}_i \mathbf{v}_i^T$$
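In NumPy, `np.linalg.svd(A, full_matrices=False)` returns this reduced form directly (for a full-column-rank example, where $k = n$; a minimal sketch):

```python
import numpy as np

# An arbitrary 3x2 example of rank 2.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# full_matrices=False drops the redundant columns of U and rows of Sigma.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U.shape, s.shape, Vt.shape)   # (3, 2) (2,) (2, 2)

# Reduced SVD expansion: sum of k rank-1 terms sigma_i u_i v_i^T.
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s)))
print(np.allclose(A, A_rebuilt))    # True
```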

### Reduced rank SVD

Considering a reduced SVD expansion of $A$ with rank $k$, suppose that the singular values $\sigma_{r+1}, \dots, \sigma_k$ are sufficiently small (relative to $\sigma_1, \dots, \sigma_r$) that dropping them would lead to an approximation of $A$ within an intended error. Therefore, we can define a reduced-rank approximation or rank-$r$ approximation of $A$ as,

$$A_r = \sum_{i=1}^{r} \sigma_i \mathbf{u}_i \mathbf{v}_i^T$$

It can be proved that $\|A - A_r\|_2 = \sigma_{r+1}$, i.e. the spectral-norm error of the rank-$r$ approximation equals the largest dropped singular value.
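This error identity can be verified numerically; a sketch with an arbitrary random matrix (note that with zero-based indexing, `s[r]` is $\sigma_{r+1}$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))     # an arbitrary full-rank random example

U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 2
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]   # rank-r truncation of the SVD

# Spectral-norm error of the rank-r approximation equals sigma_{r+1}.
err = np.linalg.norm(A - A_r, ord=2)
print(np.allclose(err, s[r]))              # True
```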

In numerical computations, the reduced rank-$r$ expansion of $A$ needs much less memory than its full-rank SVD expansion: storing the $r$ triplets $(\sigma_i, \mathbf{u}_i, \mathbf{v}_i)$ takes $r(m+n+1)$ numbers instead of the $mn$ entries of $A$.