Determinants

Table of contents
  1. Determinant
    1. Singular Matrix
    2. Properties of the Determinant
  2. Laplace Expansion
    1. Singleton Matrix Case
    2. General Case
      1. Expansion Along a Row
      2. Expansion Along a Column
    3. 2x2 Matrix Application
  3. Rule of Sarrus
  4. Determinant of a Triangular Matrix
  5. Leibniz Formula
  6. Geometrical Interpretation

Determinant

The determinant of an $n \times n$ square matrix $\boldsymbol{A}$ is denoted:

$$ \det(\boldsymbol{A}) \quad \text{or} \quad |\boldsymbol{A}| $$

It is only defined for square matrices.

Singular Matrix

The following is an important property of the determinant.

For a square matrix $\boldsymbol{A}$,

$$ \det(\boldsymbol{A}) = 0 \quad \iff \quad \boldsymbol{A} \text{ is singular} \quad \iff \quad \nexists \boldsymbol{A}^{-1} $$

Thus a singular matrix is not invertible, and the system of linear equations $\boldsymbol{A}\boldsymbol{x} = \boldsymbol{b}$ has no unique solution (it has either no solution or infinitely many).
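
For a quick numerical illustration, here is a minimal sketch assuming NumPy; the matrix below is an arbitrary singular example whose second column is twice the first:

```python
import numpy as np

# Singular matrix: the second column is twice the first,
# so the columns are linearly dependent.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

print(np.linalg.det(A))  # 0.0 (up to floating-point error)

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)  # raises "Singular matrix"
```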

Properties of the Determinant

Let $\boldsymbol{A}$, $\boldsymbol{B}$, $\boldsymbol{C}$ be $n \times n$ square matrices and $k$ a scalar (a few of these properties are verified numerically in the sketch after this list):

  • $\det(k\boldsymbol{A}) = k^n \det(\boldsymbol{A})$
    • Multiplying a single row or column by $k$ multiplies the determinant by $k$.
    • Multiplying all rows or columns by $k$ multiplies the determinant by $k^n$.
  • If $\boldsymbol{A}$, $\boldsymbol{B}$, $\boldsymbol{C}$ are identical except for a single row (or column) $i$, and that row (or column) of $\boldsymbol{C}$ is the sum of the corresponding rows (or columns) of $\boldsymbol{A}$ and $\boldsymbol{B}$, i.e. $\boldsymbol{C}_i = \boldsymbol{A}_i + \boldsymbol{B}_i$, then $\det(\boldsymbol{C}) = \det(\boldsymbol{A}) + \det(\boldsymbol{B})$.
  • If $\boldsymbol{A}$ is obtained by swapping two rows or columns of $\boldsymbol{B}$, then $\det(\boldsymbol{A}) = -\det(\boldsymbol{B})$.
  • If $\boldsymbol{A}$ has two identical rows or columns, then $\det(\boldsymbol{A}) = 0$.
  • $\det(\boldsymbol{A}) \neq 0$ if and only if $\boldsymbol{A}$ has full rank.
  • Adding a multiple of one row or column to another row or column does not change the determinant.
  • $\det(\boldsymbol{A}) = \det(\boldsymbol{A}^T)$
  • $\det(I) = 1$
  • If any row or column of $\boldsymbol{A}$ is all zeros, then $\det(\boldsymbol{A}) = 0$.
  • Multiplicativity of determinants: $\det(\boldsymbol{A}\boldsymbol{B}) = \det(\boldsymbol{A}) \det(\boldsymbol{B})$
  • $\det(\boldsymbol{A}^{-1}) = \frac{1}{\det(\boldsymbol{A})}$ (for invertible $\boldsymbol{A}$)
  • $\det(\operatorname{adj}(\boldsymbol{A})) = \det(\boldsymbol{A})^{n-1}$
  • If $\boldsymbol{A}$ is an orthogonal matrix, $|\det(\boldsymbol{A})| = 1$
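
A few of these properties can be sanity-checked numerically. The following is a minimal sketch assuming NumPy, with two arbitrary random $3 \times 3$ matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.random((n, n))
B = rng.random((n, n))
k = 2.5

det = np.linalg.det

# Multiplicativity: det(AB) = det(A) det(B)
assert np.isclose(det(A @ B), det(A) * det(B))

# Transpose invariance: det(A^T) = det(A)
assert np.isclose(det(A.T), det(A))

# Scaling: det(kA) = k^n det(A)
assert np.isclose(det(k * A), k**n * det(A))

# Inverse: det(A^{-1}) = 1 / det(A)  (A is invertible here)
assert np.isclose(det(np.linalg.inv(A)), 1 / det(A))
```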

Laplace Expansion

You need to understand minors (determinants) and cofactors before continuing.

Laplace expansion is a recursive method to calculate the determinant of a matrix.

Singleton Matrix Case

For a $1 \times 1$ matrix $\boldsymbol{A} = [a]$, the determinant is simply defined as:

$$ \det(\boldsymbol{A}) = a $$

General Case

With the base case in mind, we can recursively calculate the determinant of a matrix by expanding along a row or a column.

Expansion Along a Row

For an $n \times n$ matrix $\boldsymbol{A}$, we can fix a row $i$ and expand along that row.

Usually, we expand along the first row $i = 1$.

The determinant of $\boldsymbol{A}$ is the sum of the products of the elements in row $i$ and their cofactors.

$$ \det(\boldsymbol{A}) = \sum_{j=1}^n a_{ij} \cdot (-1)^{i+j} M_{ij} $$

Expansion along a row is more common than expansion along a column.
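
As an illustration of the row expansion, here is a minimal recursive sketch assuming NumPy; `minor` and `laplace_det` are just illustrative names, not part of any library:

```python
import numpy as np

def minor(A, i, j):
    """Submatrix obtained by deleting row i and column j (0-indexed)."""
    return np.delete(np.delete(A, i, axis=0), j, axis=1)

def laplace_det(A):
    """Determinant by Laplace expansion along the first row."""
    n = A.shape[0]
    if n == 1:  # singleton base case: det([a]) = a
        return A[0, 0]
    # With 0-based indices, (-1)^(i+j) for the first row becomes (-1)^j.
    return sum(A[0, j] * (-1) ** j * laplace_det(minor(A, 0, j))
               for j in range(n))

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(laplace_det(A), np.linalg.det(A))  # both ≈ -3.0
```

Note that this naive recursion has factorial cost, so it is mainly of pedagogical interest.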

Expansion Along a Column

By the same idea, we can fix a column $j$ and expand along that column.

The determinant of $\boldsymbol{A}$ is the sum of the products of the elements in column $j$ and their cofactors.

\[\det(\boldsymbol{A}) = \sum_{i=1}^n a_{ij} \cdot (-1)^{i+j} M_{ij}\]

2x2 Matrix Application

Let’s apply the Laplace expansion to a $2 \times 2$ matrix $\boldsymbol{A}$.

\[\boldsymbol{A} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}\]

Expanding along the first row, $i = 1$:

\[\begin{align*} \det(\boldsymbol{A}) &= \sum_{j=1}^2 a_{1j} \cdot (-1)^{1+j} M_{1j} \\ &= a \cdot (-1)^{1+1} M_{11} + b \cdot (-1)^{1+2} M_{12} \\ &= a \cdot M_{11} - b \cdot M_{12} \\ &= a \cdot d - b \cdot c \end{align*}\]

The determinant of a $2 \times 2$ matrix $\boldsymbol{A}$ is therefore:

$$ \det(\boldsymbol{A}) = ad - bc $$
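
As a quick symbolic check (a sketch assuming SymPy is available):

```python
import sympy as sp

a, b, c, d = sp.symbols("a b c d")
A = sp.Matrix([[a, b], [c, d]])

print(A.det())  # a*d - b*c
```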


Rule of Sarrus

The Rule of Sarrus is a quick method for computing the determinant of a $3 \times 3$ matrix.

Let $\boldsymbol{A}$ be a $3 \times 3$ matrix:

$$ \det(\boldsymbol{A}) = \begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix} = a(ei - fh) - b(di - fg) + c(dh - eg) = aei + bfg + cdh - ceg - bdi - afh $$
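
A direct translation into code (a sketch assuming NumPy; `sarrus_det` is just an illustrative name):

```python
import numpy as np

def sarrus_det(A):
    """Determinant of a 3x3 matrix via the Rule of Sarrus."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

A = np.array([[2.0, 0.0, 1.0],
              [3.0, 5.0, 2.0],
              [1.0, 4.0, 6.0]])
print(sarrus_det(A), np.linalg.det(A))  # both ≈ 51.0
```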


Determinant of a Triangular Matrix

See triangular matrices.

For a triangular matrix $\boldsymbol{T}$,

$$ \det(\boldsymbol{T}) = \prod_{i=1}^n t_{ii} $$

i.e. the determinant is the product of the diagonal elements.
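
A minimal sketch assuming NumPy, with an arbitrary upper triangular matrix:

```python
import numpy as np

T = np.array([[2.0, 7.0, 1.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 4.0]])

print(np.prod(np.diag(T)))  # 24.0, the product of the diagonal entries
print(np.linalg.det(T))     # ≈ 24.0
```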


Leibniz Formula

The Leibniz formula is another way to calculate the determinant of a square matrix.

It comes from the observation that fully expanding the determinant (e.g. by repeatedly applying the Laplace expansion) yields a sum of signed products of matrix elements.

The explanation below assumes the expansion is carried out along rows, for simplicity.

The key takeaway is that each product in the fully expanded sum has the following form:

\[\prod_{i=1}^n a_{i\sigma(i)}\]

That is, while the row index $i$ runs from $1$ to $n$ (i.e. follows the identity permutation), the column index is given by some permutation $\sigma$ of $\{1, \dots, n\}$.

For example, $a_{12} \cdot a_{23} \cdot a_{31}$, where the columns are permuted to $2, 3, 1$.

In addition, the sign attached to each product ends up equal to the sign of the permutation $\sigma$, which is determined by its parity: $+1$ for an even permutation and $-1$ for an odd one.

Therefore the determinant of $\boldsymbol{A}$ can be written as:

$$ \det(\boldsymbol{A}) = \sum_{\sigma \in S_n} \text{sgn}(\sigma) \prod_{i=1}^n a_{i\sigma(i)} $$

Or equivalently,

\[\det(\boldsymbol{A}) = \sum_{\sigma \in S_n} \text{sgn}(\sigma) \prod_{i=1}^n a_{\sigma(i)i}\]
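
The formula translates almost literally into code. Below is a minimal sketch assuming NumPy; `sign` and `leibniz_det` are illustrative names, and the parity of each permutation is obtained by counting inversions:

```python
import itertools
import math
import numpy as np

def sign(perm):
    """Sign of a permutation: +1 if even, -1 if odd (via inversion count)."""
    inversions = sum(1 for i in range(len(perm))
                     for j in range(i + 1, len(perm))
                     if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    """Determinant via the Leibniz formula (sums over all n! permutations)."""
    n = A.shape[0]
    return sum(sign(sigma) * math.prod(A[i, sigma[i]] for i in range(n))
               for sigma in itertools.permutations(range(n)))

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(leibniz_det(A), np.linalg.det(A))  # both ≈ -3.0
```

Note that the sum runs over all $n!$ permutations, so this is only practical for very small matrices.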

Geometrical Interpretation

In 2D, the determinant of a $2 \times 2$ matrix can be interpreted as the signed area of the parallelogram formed by the two column vectors.

In 3D, the determinant of a $3 \times 3$ matrix can be interpreted as the signed volume of the parallelepiped formed by the three column vectors.

A parallelepiped is a 3D generalization of a parallelogram (i.e. a linearly transformed cube).

If the columns are linearly dependent, the area or volume collapses to zero, consistent with the matrix being singular ($\det = 0$).
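
A small sketch assuming NumPy: the signed area of the parallelogram spanned by two column vectors is the determinant of the matrix built from those columns.

```python
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([1.0, 2.0])

# Columns of M are the two vectors spanning the parallelogram.
M = np.column_stack([u, v])
print(np.linalg.det(M))  # 6.0: the (signed) area of the parallelogram

# Swapping the vectors flips the orientation, hence the sign.
print(np.linalg.det(np.column_stack([v, u])))  # -6.0

# Linearly dependent columns collapse the parallelogram: zero area.
print(np.linalg.det(np.column_stack([u, 2 * u])))  # 0.0
```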