Complexity of Algorithms

Table of contents
  1. Commonly used complexity classes
  2. Asymptotic notation
    1. Big-O notation
    2. Big-Omega notation
    3. Big-Theta notation
  3. Master theorem
    1. Divide-and-conquer recurrence relations
      1. Simple form
      2. Advanced form
    2. Decreasing recurrence relations
  4. Amortized analysis
  5. Examples

Commonly used complexity classes

  • 1: constant
  • log log(n): double logarithmic
  • log(n): logarithmic
  • √n: square root
  • n: linear
  • log(n!)
  • n log(n): loglinear
  • n^c: polynomial
  • c^n: exponential
  • n!: factorial

Commonly used definitions in complexity analysis

Properties of logarithms

log(x^y) = y log(x)
a^(log_b(x)) = x^(log_b(a))
log(xy) = log(x) + log(y)
log(x/y) = log(x) − log(y)
log_b(x) = log_a(x) / log_a(b)
∑_{k=1}^{n} log(k) ≈ n log(n)
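
These identities are easy to spot-check numerically. A minimal Python sketch (the test values x, y, a, b are arbitrary choices, not part of the identities):

```python
import math

# Arbitrary positive test values (any x, y > 0 and bases a, b > 0, != 1 work)
x, y, a, b = 3.0, 5.0, 2.0, 7.0

assert math.isclose(math.log(x ** y), y * math.log(x))           # log(x^y) = y log(x)
assert math.isclose(a ** math.log(x, b), x ** math.log(a, b))    # a^(log_b x) = x^(log_b a)
assert math.isclose(math.log(x * y), math.log(x) + math.log(y))  # product rule
assert math.isclose(math.log(x / y), math.log(x) - math.log(y))  # quotient rule
assert math.isclose(math.log(x, b), math.log(x, a) / math.log(b, a))  # change of base

# The last identity: the sum of log(k) grows like n log(n) (the ratio tends to 1)
n = 10 ** 6
total = sum(math.log(k) for k in range(1, n + 1))
assert 0.9 < total / (n * math.log(n)) < 1.0
```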

Arithmetic series

∑_{k=1}^{n} k = 1 + 2 + ⋯ + n = n(n+1)/2
∑_{k=a}^{b} k = (b − a + 1)(b + a)/2

Geometric series

∑_{k=0}^{n} a·r^k = a + ar + ar² + ⋯ + ar^n = a(1 − r^(n+1))/(1 − r),  for r ≠ 1

Harmonic series

∑_{k=1}^{n} 1/k = 1 + 1/2 + ⋯ + 1/n ≈ log(n)

Faulhaber’s formula

A special case of the power sum.

∑_{k=1}^{n} k^p = 1^p + 2^p + ⋯ + n^p ≈ n^(p+1)/(p+1)
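
The series formulas above can likewise be spot-checked numerically; a small Python sketch with arbitrary values for n, a, r, and p:

```python
import math

# Arbitrary test values for each formula
n, a, r, p = 50, 3.0, 0.5, 4

# Arithmetic series: 1 + 2 + ... + n = n(n+1)/2
assert sum(range(1, n + 1)) == n * (n + 1) // 2
lo, hi = 7, 23
assert sum(range(lo, hi + 1)) == (hi - lo + 1) * (hi + lo) // 2

# Geometric series (r != 1): a(1 - r^(n+1)) / (1 - r)
geometric = sum(a * r ** k for k in range(n + 1))
assert math.isclose(geometric, a * (1 - r ** (n + 1)) / (1 - r))

# Harmonic series: stays within a constant of log(n)
h = sum(1 / k for k in range(1, n + 1))
assert abs(h - math.log(n)) < 1.0

# Faulhaber: the sum of k^p is roughly n^(p+1) / (p+1)
s = sum(k ** p for k in range(1, n + 1))
assert 1.0 < s / (n ** (p + 1) / (p + 1)) < 1.2
```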

Asymptotic notation

Big-O notation

Big-O O is for the upper bound (worst case).

Not necessarily tight.

For functions f and g,

∃ n₀, c > 0 s.t. ∀ n ≥ n₀, f(n) ≤ c·g(n) ⟺ f(n) = O(g(n))

  • Transitivity: f(n) = O(g(n)) ∧ g(n) = O(h(n)) ⟹ f(n) = O(h(n))
  • Reflexivity: f(n) = O(f(n))

Big-Omega notation

Big-Omega Ω is for the lower bound (best case).

Not necessarily tight.

For functions f and g,

∃ n₀, c > 0 s.t. ∀ n ≥ n₀, f(n) ≥ c·g(n) ⟺ f(n) = Ω(g(n))

  • Transitivity
  • Reflexivity
  • Transpose symmetry: f(n) = Ω(g(n)) ⟺ g(n) = O(f(n))

Big-Theta notation

Big-Theta Θ is for the tight bound: both an upper and a lower bound.

For functions f and g,

∃ n₀, c₁, c₂ > 0 s.t. ∀ n ≥ n₀, c₁·g(n) ≤ f(n) ≤ c₂·g(n) ⟺ f(n) = Θ(g(n))

  • Transitivity
  • Reflexivity
  • Symmetry: f(n) = Θ(g(n)) ⟺ g(n) = Θ(f(n))
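
For a concrete instance of these definitions, f(n) = 3n² + 10n is Θ(n²); the witnesses c₁ = 3, c₂ = 4, n₀ = 10 below are one valid choice among many:

```python
def f(n):
    return 3 * n ** 2 + 10 * n

def g(n):
    return n ** 2

# Witnesses: 3n^2 <= f(n) always, and f(n) <= 4n^2 once 10n <= n^2, i.e. n >= 10
c1, c2, n0 = 3, 4, 10

for n in range(n0, 10_000):
    assert c1 * g(n) <= f(n) <= c2 * g(n)  # hence f(n) = Θ(g(n))
```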

Master theorem

Divide-and-conquer recurrence relations

For recursive, divide-and-conquer algorithms, the master theorem can be used to determine the asymptotic complexity.

Simple form

If the algorithm has the following recurrence relation,

T(n) = a·T(n/b) + O(n^d)

where a > 0, b > 1, and d ≥ 0, you may use the following rules of the master theorem.

  • If a < b^d (dominated by the work at the root),

    T(n) = O(n^d)

  • If a = b^d (work is evenly distributed in each layer),

    T(n) = O(n^d · log(n))

  • If a > b^d (dominated by the work at the leaves),

    T(n) = O(n^(log_b(a)))
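
These three cases are mechanical enough to encode directly. A sketch (the function name and the classic examples are assumptions of mine, not from the text):

```python
import math

def master_simple(a: int, b: int, d: float) -> str:
    """Classify T(n) = a*T(n/b) + O(n^d) by the three cases above."""
    if a < b ** d:                        # root dominates
        return f"O(n^{d})"
    if a == b ** d:                       # even work per layer
        return f"O(n^{d} log n)"
    return f"O(n^{math.log(a, b):g})"     # leaves dominate

# Merge sort, T(n) = 2T(n/2) + O(n):
assert master_simple(2, 2, 1) == "O(n^1 log n)"
# Binary search, T(n) = T(n/2) + O(1):
assert master_simple(1, 2, 0) == "O(n^0 log n)"   # i.e. O(log n)
# Karatsuba multiplication, T(n) = 3T(n/2) + O(n): roughly O(n^1.585)
print(master_simple(3, 2, 1))
```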

Advanced form

If the algorithm has the following recurrence relation,

T(n) = a·T(n/b) + Θ(n^k · log^p(n))

where a ≥ 1, b > 1, k ≥ 0, and p ∈ ℝ, you may use the following rules of the master theorem.

  • If a > b^k,

    T(n) = Θ(n^(log_b(a)))

  • If a = b^k,
    • If p > −1,

      T(n) = Θ(n^(log_b(a)) · log^(p+1)(n))

    • If p = −1,

      T(n) = Θ(n^(log_b(a)) · log log(n))

    • If p < −1,

      T(n) = Θ(n^(log_b(a)))

  • If a < b^k,
    • If p ≥ 0,

      T(n) = Θ(n^k · log^p(n))

    • If p < 0,

      T(n) = O(n^k)
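
The advanced cases can be encoded the same way; a sketch with an assumed helper name:

```python
import math

def master_advanced(a, b, k, p):
    """Classify T(n) = a*T(n/b) + Θ(n^k log^p(n)) by the cases above."""
    if a > b ** k:
        return f"Theta(n^{math.log(a, b):g})"
    if a == b ** k:                  # here log_b(a) == k
        if p > -1:
            return f"Theta(n^{k} log^{p + 1}(n))"
        if p == -1:
            return f"Theta(n^{k} loglog(n))"
        return f"Theta(n^{k})"
    if p >= 0:                       # a < b^k
        return f"Theta(n^{k} log^{p}(n))"
    return f"O(n^{k})"

# T(n) = 2T(n/2) + Θ(n log n) falls in the a = b^k, p > -1 case:
assert master_advanced(2, 2, 1, 1) == "Theta(n^1 log^2(n))"
# T(n) = 4T(n/2) + Θ(n) falls in the a > b^k case:
assert master_advanced(4, 2, 1, 0) == "Theta(n^2)"
```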

Decreasing recurrence relations

If the algorithm has the following recurrence relation,

T(n) = a·T(n − b) + O(n^d)

where a > 0, b > 0, and d ≥ 0, you may use the following rules of the master theorem.

  • If a < 1 (dominated by the work at the root),

    T(n) = O(n^d)

  • If a = 1 (work is evenly distributed in each layer),

    T(n) = O(n^(d+1))

  • If a > 1 (dominated by the work at the leaves),

    T(n) = O(n^d · a^(n/b))
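
As an illustration of the a > 1 case, a recursion obeying T(n) = 2T(n − 1) + O(1) makes Θ(2^n) calls; this toy sketch just counts them:

```python
def calls(n: int) -> int:
    """Total calls made by a recursion with T(n) = 2T(n - 1) + O(1)."""
    if n <= 0:
        return 1
    return 1 + calls(n - 1) + calls(n - 1)

# With a = 2, b = 1, d = 0 the rule gives O(n^0 * 2^(n/1)) = O(2^n);
# for this particular recursion the count is exactly 2^(n+1) - 1.
for n in range(12):
    assert calls(n) == 2 ** (n + 1) - 1
```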


Amortized analysis

Amortized analysis is a method for analyzing a sequence of operations. It is often used to show that a sequence of operations is efficient.

Even if a single operation is expensive, the amortized cost of each operation can be small if the expensive operation is performed rarely.

Amortized analysis is used when the worst-case asymptotic analysis over a sequence of operations is unfairly pessimistic.
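
A classic example is a dynamic array that doubles its capacity when full: a single append may copy n elements, yet the total copying over n appends stays linear. A simplified counting sketch (it tracks copy work rather than storing real elements):

```python
class DynamicArray:
    """Doubling array: one append may be O(n), but the amortized cost is O(1)."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0  # elements moved during all regrowths so far

    def append(self, item):
        if self.size == self.capacity:
            self.capacity *= 2        # the rare, expensive step
            self.copies += self.size  # every existing element is copied over
        self.size += 1

n = 1_000_000
arr = DynamicArray()
for i in range(n):
    arr.append(i)

# Total copy work stays under 2n, so each append costs O(1) amortized.
assert arr.copies < 2 * n
```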


Examples
