Complexity of Algorithms

Table of contents
  1. Commonly used complexity classes
  2. Asymptotic notation
    1. Big-O notation
    2. Big-Omega notation
    3. Big-Theta notation
  3. Master theorem
    1. Divide-and-conquer recurrence relations
      1. Simple form
      2. Advanced form
    2. Decreasing recurrence relations
  4. Amortized analysis
  5. Examples

Commonly used complexity classes

  • $1$: constant
  • $\log\log(n)$
  • $\sqrt{\log(n)}$
  • $\log(n)$: logarithmic
  • $\sqrt{n}$: square root
  • $n$: linear
  • $\log(n!)$
  • $n\log(n)$: loglinear
  • $n^c$: polynomial
  • $c^n$: exponential
  • $n!$: factorial
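
As a rough illustration, here are minimal Python sketches of operations that exhibit a few of these classes (the function names are invented for illustration):

```python
def constant(xs):          # O(1): index access does not depend on len(xs)
    return xs[0]

def logarithmic(xs, x):    # O(log n): binary search halves the range each step
    lo, hi = 0, len(xs)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] < x:
            lo = mid + 1
        else:
            hi = mid
    return lo              # insertion point of x in sorted xs

def linear(xs):            # O(n): a single pass over the input
    return sum(xs)

def loglinear(xs):         # O(n log n): comparison-based sort
    return sorted(xs)

def quadratic(xs):         # O(n^2): all pairs; the polynomial class with c = 2
    return [(a, b) for a in xs for b in xs]
```
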
Commonly used definitions in complexity analysis

Properties of logarithms

\[\log(x^y) = y\log(x)\] \[a^{\log_b(x)} = x^{\log_b(a)}\] \[\log(xy) = \log(x) + \log(y)\] \[\log(\frac{x}{y}) = \log(x) - \log(y)\] \[\log_b(x) = \frac{\log_a(x)}{\log_a(b)}\] \[\sum_{k=1}^n \log(k) \approx n\log(n)\]
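
The change-of-base rule is why the base of a logarithm is usually omitted in asymptotic notation: any two bases differ only by a constant factor. For example,

\[\log_2(n) = \frac{\log_{10}(n)}{\log_{10}(2)} \approx 3.32\,\log_{10}(n)\]

so $O(\log_2(n))$ and $O(\log_{10}(n))$ are the same class.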

Arithmetic series

\[\sum_{k=1}^n k = 1 + 2 + \cdots + n = \frac{n(n+1)}{2}\] \[\sum_{k=a}^b k = (b-a+1)\frac{b+a}{2}\]

Geometric series

\[\sum_{k=0}^n ar^k = a + ar + ar^2 + \cdots + ar^n = \frac{a(1-r^{n+1})}{1-r} \quad (r \neq 1)\]

Harmonic series

\[\sum_{k=1}^n \frac{1}{k} = 1 + \frac{1}{2} + \cdots + \frac{1}{n} \approx \ln(n)\]

Faulhaber’s formula

Leading-term approximation of the power sum, valid for constant $p \geq 0$ (the exact Faulhaber formula is a polynomial of degree $p+1$ in $n$).

\[\sum_{k=1}^n k^p = 1^p + 2^p + \cdots + n^p \approx \frac{n^{p+1}}{p+1}\]
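
As a sanity check, $p = 1$ recovers the leading term of the arithmetic series above:

\[\sum_{k=1}^n k = \frac{n^2}{2} + \frac{n}{2} \approx \frac{n^{1+1}}{1+1}\]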

Asymptotic notation

Big-O notation

Big-O $O$ denotes an asymptotic upper bound. It is commonly used to state worst-case cost, but an upper bound is not itself the same thing as the worst case.

Not necessarily tight.

For functions $f$ and $g$,

$$ \exists n_0, c > 0 \text{ s.t. } \forall n \geq n_0,\; f(n) \leq c g(n) \implies f(n) = O(g(n)) $$

  • Transitivity: $f(n) = O(g(n)) \land g(n) = O(h(n)) \implies f(n) = O(h(n))$
  • Reflexivity: $f(n) = O(f(n))$
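
For example, $3n^2 + 10n = O(n^2)$: taking $c = 4$ and $n_0 = 10$, we have $10n \leq n \cdot n$ for all $n \geq 10$, so

$$ \forall n \geq 10,\; 3n^2 + 10n \leq 3n^2 + n^2 = 4n^2 $$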

Big-Omega notation

Big-Omega $\Omega$ denotes an asymptotic lower bound. It is commonly used to state best-case cost, but a lower bound is not itself the same thing as the best case.

Not necessarily tight.

For functions $f$ and $g$,

$$ \exists n_0, c > 0 \text{ s.t. } \forall n \geq n_0,\; f(n) \geq c g(n) \implies f(n) = \Omega(g(n)) $$

  • Transitivity
  • Reflexivity
  • Transpose symmetry: $f(n) = \Omega(g(n)) \iff g(n) = O(f(n))$
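
For example, $n^2 = \Omega(n)$ with $c = 1$ and $n_0 = 1$; by transpose symmetry, this is the same statement as $n = O(n^2)$.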

Big-Theta notation

Big-Theta $\Theta$ denotes an asymptotically tight bound: $g$ bounds $f$ both above and below, up to constant factors. (It is sometimes conflated with average case, but a tight bound and average-case cost are different notions.)

For functions $f$ and $g$,

$$ \exists n_0, c_1, c_2 > 0 \text{ s.t. } \forall n \geq n_0,\; c_1 g(n) \leq f(n) \leq c_2 g(n) \implies f(n) = \Theta(g(n)) $$

  • Transitivity
  • Reflexivity
  • Symmetry: $f(n) = \Theta(g(n)) \iff g(n) = \Theta(f(n))$
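
A useful equivalent characterization: $f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \land f(n) = \Omega(g(n))$, so a tight bound can be proven by establishing the upper and lower bounds separately.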

Master theorem

Divide-and-conquer recurrence relations

For recursive, divide-and-conquer algorithms, the master theorem can be used to determine the asymptotic complexity.

Simple form

If the algorithm has the following recurrence relation,

$$ T(n) = aT(\frac{n}{b}) + O(n^d) $$

where $a > 0$, $b > 1$, and $d \geq 0$, then the following cases of the master theorem apply (worked examples follow the cases).

  • If $a < b^d$ (dominated by the work at root),

    $$ T(n) = O(n^d) $$

  • If $a = b^d$ (work is evenly distributed in each layer),

    $$ T(n) = O(n^d \log(n)) $$

  • If $a > b^d$ (dominated by the work at leaves),

    $$ T(n) = O(n^{\log_b(a)}) $$
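
For example:

  • Binary search, $T(n) = T(\frac{n}{2}) + O(1)$: $a = 1 = b^d = 2^0$, so $T(n) = O(\log(n))$.
  • Merge sort, $T(n) = 2T(\frac{n}{2}) + O(n)$: $a = 2 = b^d = 2^1$, so $T(n) = O(n\log(n))$.
  • Karatsuba multiplication, $T(n) = 3T(\frac{n}{2}) + O(n)$: $a = 3 > b^d = 2^1$, so $T(n) = O(n^{\log_2(3)}) \approx O(n^{1.585})$.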

Advanced form

If the algorithm has the following recurrence relation,

$$ T(n) = aT(\frac{n}{b}) + \Theta(n^k \log^p(n)) $$

where $a \geq 1$, $b > 1$, $k \geq 0$, and $p \in \mathbb{R}$, then the following cases of the master theorem apply (a worked example follows the cases).

  • If $a > b^k$,

    $$ T(n) = \Theta(n^{\log_b(a)}) $$

  • If $a = b^k$,
    • If $p > -1$,

      $$ T(n) = \Theta(n^{\log_b(a)} \log^{p+1}(n)) $$

    • If $p = -1$,

      $$ T(n) = \Theta(n^{\log_b(a)} \log\log(n)) $$

    • If $p < -1$,

      $$ T(n) = \Theta(n^{\log_b(a)}) $$

  • If $a < b^k$,
    • If $p \geq 0$,

      $$ T(n) = \Theta(n^k \log^p(n)) $$

    • If $p < 0$,

      $$ T(n) = O(n^k) $$
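
For example, $T(n) = 2T(\frac{n}{2}) + \Theta(n\log(n))$ has $a = 2$, $b = 2$, $k = 1$, $p = 1$; since $a = b^k$ and $p > -1$, $T(n) = \Theta(n\log^2(n))$.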

Decreasing recurrence relations

If the algorithm has the following recurrence relation,

$$ T(n) = aT(n-b) + O(n^d) $$

where $a > 0$, $b > 0$, and $d \geq 0$, then the following cases of the master theorem apply (worked examples follow the cases).

  • If $a < 1$ (dominated by the work at root),

    $$ T(n) = O(n^d) $$

  • If $a = 1$ (work is evenly distributed in each layer),

    $$ T(n) = O(n^{d+1}) $$

  • If $a > 1$ (dominated by the work at leaves),

    $$ T(n) = O(n^d a^{n/b}) $$
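
For example:

  • $T(n) = T(n-1) + O(1)$ (e.g., recursively scanning a list): $a = 1$, $d = 0$, so $T(n) = O(n)$.
  • $T(n) = T(n-1) + O(n)$ (e.g., selection sort): $a = 1$, $d = 1$, so $T(n) = O(n^2)$.
  • $T(n) = 2T(n-1) + O(1)$ (e.g., towers of Hanoi): $a = 2$, $b = 1$, $d = 0$, so $T(n) = O(2^n)$.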


Amortized analysis

Amortized analysis is a method for analyzing the total cost of a sequence of operations rather than each operation in isolation. It is often used to show that a whole sequence of operations is efficient.

Even if a single operation is expensive, the amortized cost per operation can be small when the expensive operation occurs rarely.

Amortized analysis is used when applying worst-case analysis to every individual operation is overly pessimistic about the cost of the sequence as a whole.
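
A classic illustration is a dynamic array that doubles its capacity when full: a single append can cost $O(n)$ because it copies every element, but any sequence of $n$ appends performs fewer than $2n$ copies in total, so the amortized cost per append is $O(1)$. A minimal sketch (class and counter names are illustrative):

```python
class DynamicArray:
    """Append-only array that doubles its capacity when full."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.copies = 0  # total elements moved by resizes

    def append(self, x):
        if self.size == self.capacity:       # expensive (rare) case
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):       # O(size) copy
                new_data[i] = self.data[i]
                self.copies += 1
            self.data = new_data
        self.data[self.size] = x             # cheap (common) case: O(1)
        self.size += 1

arr = DynamicArray()
for i in range(1024):
    arr.append(i)
# Resizes copied 1 + 2 + 4 + ... + 512 elements: 1023 < 2 * 1024 total.
print(arr.copies)  # 1023
```
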


Examples
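
One worked example tying the pieces together (a minimal sketch; the function is illustrative): recursive binary search, analyzed with the simple form of the master theorem.

```python
def binary_search(xs, x, lo=0, hi=None):
    """Return an index of x in sorted list xs, or -1 if absent."""
    if hi is None:
        hi = len(xs)
    if lo >= hi:                    # empty range: x is absent
        return -1
    mid = (lo + hi) // 2
    if xs[mid] == x:
        return mid
    if xs[mid] < x:                 # recurse on one half of the range
        return binary_search(xs, x, mid + 1, hi)
    return binary_search(xs, x, lo, mid)

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

Each call does $O(1)$ work and recurses on one half of the input, so $T(n) = T(\frac{n}{2}) + O(1)$: $a = 1$, $b = 2$, $d = 0$, and $a = b^d$, giving $T(n) = O(n^d \log(n)) = O(\log(n))$.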
