## Rank of a matrix

Notation: vectors and matrices below are written in coordinates with respect to a chosen basis.

Let $A \in \mathbb{R}^{m \times n}$ (the results also hold for $A \in \mathbb{C}^{m \times n}$). Then, the **column rank**/**row rank** of $A$ is defined as the dimension of the column/row space of $A$, i.e. the dimension of the vector space spanned by the columns/rows of $A$; this is equivalent to the number of linearly independent columns/rows (column/row vectors) of $A$.

**Theorem:** column rank of $A$ = row rank of $A$.

**Definition:** the rank of a matrix, $\operatorname{rank}(A)$, is the dimension of either the column or the row space of $A$; equivalently, the number of linearly independent columns or rows of $A$.

**Definition:** for a linear map $f : V \to W$, the rank of the linear map is defined as the dimension of the image of $f$, i.e. $\operatorname{rank}(f) = \dim(\operatorname{im}(f))$. This definition is equivalent to the definition of the matrix rank, as every linear map between finite-dimensional spaces has a matrix $A$ by which it can be written as $f(x) = Ax$, and the image of $f$ is then the column space of $A$.

**Proposition**: $\operatorname{rank}(A) \le \min(m, n)$. This leads to these definitions: a matrix is said to be **full rank** iff $\operatorname{rank}(A) = \min(m, n)$, i.e. the largest possible rank, and it is said to be **rank deficient** iff $\operatorname{rank}(A) < \min(m, n)$, i.e. not having full rank.
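As a quick numerical illustration (a sketch of mine, not from the notes, using NumPy's `np.linalg.matrix_rank`; the matrices are arbitrary examples):

```python
import numpy as np

# A 3x2 matrix: its rank is at most min(3, 2) = 2.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(np.linalg.matrix_rank(A))  # 2 -> full rank

# Make the second column a multiple of the first: the rank drops to 1.
B = np.array([[1.0, 2.0],
              [3.0, 6.0],
              [5.0, 10.0]])
print(np.linalg.matrix_rank(B))  # 1 -> rank deficient
```

Note that `matrix_rank` computes the rank numerically from singular values, up to a tolerance, rather than symbolically.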

### Properties of rank

For $A \in \mathbb{R}^{m \times n}$ (and $B$ of compatible dimensions where it appears):

**1-** Only the zero matrix has rank zero.

**2-** If $P$ and $Q$ are invertible matrices of compatible sizes, then $\operatorname{rank}(PAQ) = \operatorname{rank}(A)$, i.e. rank is unchanged by multiplication with invertible matrices.

**3-** $\operatorname{rank}(A + B) \le \operatorname{rank}(A) + \operatorname{rank}(B)$ (subadditivity).

**4-** $\operatorname{rank}(AB) \le \min(\operatorname{rank}(A), \operatorname{rank}(B))$.

**5-** $\operatorname{rank}(A) = \operatorname{rank}(A^\top) = \operatorname{rank}(A^\top A) = \operatorname{rank}(A A^\top)$.

**6-** If $v \in \mathbb{R}^r$ is a non-zero vector, then for $V = v v^\top$, $\operatorname{rank}(V) = 1$. In addition, for $A = \sum_{i=1}^{n} v_i v_i^\top$, $\operatorname{rank}(A) \le n$, i.e. a sum of $n$ rank-one matrices has at most rank *n*.

**7-** A diagonalizable square matrix $A$ can be decomposed as $A = PDP^{-1}$, where $D$ is a diagonal matrix containing the eigenvalues of $A$. Then, $\operatorname{rank}(A) = \operatorname{rank}(D)$, i.e. the number of non-zero eigenvalues of $A$.

**8-** For a square matrix $A \in \mathbb{R}^{n \times n}$, the following are equivalent: $A$ is full rank ($\operatorname{rank}(A) = n$), $A$ is invertible, $A$ has non-zero determinant, and $A$ has *n* non-zero eigenvalues.
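Several of the properties above can be checked numerically; the following is an illustrative sketch (the matrices are arbitrary choices of mine, not from the notes):

```python
import numpy as np

rank = np.linalg.matrix_rank
A = np.array([[1.0, 0.0], [0.0, 0.0]])   # rank 1
B = np.array([[0.0, 0.0], [0.0, 1.0]])   # rank 1

# Subadditivity: rank(A + B) <= rank(A) + rank(B).
assert rank(A + B) <= rank(A) + rank(B)      # here 2 <= 2

# Product rule: rank(AB) <= min(rank(A), rank(B)).
assert rank(A @ B) <= min(rank(A), rank(B))  # here 0 <= 1

# rank(A) = rank(A^T) = rank(A^T A).
assert rank(A) == rank(A.T) == rank(A.T @ A)

# A + B is the identity: full rank, invertible, non-zero
# determinant, and all eigenvalues non-zero.
C = A + B
assert rank(C) == 2
assert abs(np.linalg.det(C)) > 1e-12
assert np.all(np.abs(np.linalg.eigvals(C)) > 1e-12)
print("all checks passed")
```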

## Proofs

**P6:**

$$
V = v v^\top = \begin{bmatrix} v^1 \begin{bmatrix} v^1 \\ v^2 \\ \vdots \\ v^r \end{bmatrix} & v^2 \begin{bmatrix} v^1 \\ v^2 \\ \vdots \\ v^r \end{bmatrix} & \dots & v^r \begin{bmatrix} v^1 \\ v^2 \\ \vdots \\ v^r \end{bmatrix} \end{bmatrix}
$$

where $v^1, v^2, \dots, v^r$ are the coordinates of the vector $v$. This indicates that each column of $V$ is a scalar multiple of every other column of $V$; therefore, the column space is one-dimensional. Hence, $\operatorname{rank}(V) = 1$.

For the second part, applying property 3 (subadditivity of rank) to the sum $A = \sum_{i=1}^{n} v_i v_i^\top$ of $n$ rank-one matrices gives $\operatorname{rank}(A) \le n$.
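The rank-one structure used in this proof can be illustrated numerically (a sketch of mine using NumPy's `np.outer`; the vectors are arbitrary examples):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
V = np.outer(v, v)               # V = v v^T: every column is a multiple of v
print(np.linalg.matrix_rank(V))  # 1

# A sum of n rank-one matrices has rank at most n (here n = 2).
u = np.array([0.0, 1.0, -1.0])
A = np.outer(v, v) + np.outer(u, u)
print(np.linalg.matrix_rank(A))  # 2 (at most n = 2)
```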

## The Diagonalization problem

**The Diagonalization problem:** given a square matrix $A \in \mathbb{R}^{n \times n}$, we want to find an invertible (non-singular) matrix $P$ for which $P^{-1} A P$ is a diagonal matrix.

**Theorem:** $A \in \mathbb{R}^{n \times n}$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors. In other words, its eigenvectors span (are a basis for) $\mathbb{R}^n$. Another way to say this is that $A$ is diagonalizable if and only if the sum of the dimensions of the eigenspaces corresponding to its eigenvalues equals $n$. This in turn means that the sum of the geometric multiplicities of its eigenvalues must be $n$. Equivalently, $A$ is diagonalizable if and only if the geometric multiplicity of each eigenvalue of $A$ is the same as its algebraic multiplicity (because the geometric multiplicity of an eigenvalue is less than or equal to its algebraic multiplicity, and the algebraic multiplicities sum to $n$).
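A minimal sketch (assuming NumPy; the Jordan-block matrix is my own example) of a matrix that fails this criterion: the eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1, so the matrix is not diagonalizable.

```python
import numpy as np

# Jordan block: eigenvalue 1 with algebraic multiplicity 2.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Geometric multiplicity = dim of the eigenspace = n - rank(J - 1*I).
geom = 2 - np.linalg.matrix_rank(J - np.eye(2))
print(geom)  # 1 < 2, so J is NOT diagonalizable
```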

**Proposition:** if $A \in \mathbb{R}^{n \times n}$ has $n$ independent eigenvectors, then $A = PDP^{-1}$, where $D$ is diagonal and consists of the eigenvalues of $A$, and the columns of $P$ are the corresponding eigenvectors. Note that $P$ is full rank, i.e. its column space is $n$-dimensional; therefore it is non-singular, i.e. it is invertible.
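This decomposition can be computed with NumPy's `np.linalg.eig` (a sketch with an arbitrary example matrix of mine):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# (as the columns of P).
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# P has 2 independent eigenvectors, so it is invertible
# and A = P D P^{-1} holds.
assert np.linalg.matrix_rank(P) == 2
assert np.allclose(A, P @ D @ np.linalg.inv(P))
print("A = P D P^{-1} verified")
```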

**Theorem [linear independence of eigenvectors]:** for a square matrix $A$, eigenvectors corresponding to distinct eigenvalues are linearly independent.

**Theorem [eigenvalues and diagonalizability]:** $A \in \mathbb{R}^{n \times n}$ with $n$ distinct eigenvalues is diagonalizable. Note that the converse is not necessarily true (e.g. the identity matrix is diagonalizable despite a repeated eigenvalue); the sum of the geometric multiplicities is the key.

**Definition:** a square matrix $A$ is said to be orthogonally diagonalizable if there is an orthogonal matrix $Q$ for which $Q^\top A Q$ is diagonal. An orthogonal matrix is a matrix whose columns, and as a result also its rows, form an orthonormal basis of $\mathbb{R}^n$; equivalently, $Q^\top Q = Q Q^\top = I$, so $Q^{-1} = Q^\top$.

**Theorem [orthogonally diagonalizable]:** $A$ is orthogonally diagonalizable if and only if there exists an orthonormal set of $n$ eigenvectors of $A$. Then, $A = Q D Q^\top$, where the columns of $Q$ are the orthonormal eigenvectors spanning $\mathbb{R}^n$ and $D$ is the diagonal matrix of eigenvalues.

**Theorem [diagonalization of symmetric matrices]:** a square matrix is orthogonally diagonalizable if and only if it is symmetric (the spectral theorem). Moreover, if $A$ is a symmetric matrix, then the eigenvectors from different eigenspaces are orthogonal.

To diagonalize a symmetric matrix $A$:

1) Find the eigenvalues of $A$.
2) Find a (normalized) basis for each eigenspace.
3) Since eigenvectors of different eigenspaces (corresponding to different eigenvalues) are orthogonal, an orthonormal basis can be constructed by applying the Gram-Schmidt process within each eigenspace. Construct the matrices $Q$ and $D$: the columns of $Q$ are the orthonormal eigenvectors, and $D$ contains the eigenvalues on its diagonal, in the same order as the corresponding eigenvectors appear in the columns of $Q$.
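The procedure above can be sketched with NumPy's `np.linalg.eigh`, which is specialized for symmetric matrices and already returns orthonormal eigenvectors (the example matrix is my own):

```python
import numpy as np

# A symmetric matrix is orthogonally diagonalizable (spectral theorem).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eigh handles symmetric (Hermitian) matrices: it returns
# real eigenvalues and orthonormal eigenvectors as the columns of Q.
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

assert np.allclose(Q.T @ Q, np.eye(2))  # Q is orthogonal: Q^T Q = I
assert np.allclose(A, Q @ D @ Q.T)      # A = Q D Q^T
print("A = Q D Q^T verified")
```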