Covariance

Table of contents
  1. Understanding covariance
  2. Sample Covariance
  3. Independence and Covariance
  4. Covariance Matrix

Understanding covariance

Covariance is a measure of how two variables vary together.

It is defined as follows:

$$ \begin{align*} \Cov[X, Y] &= \E[(X - \E[X])(Y - \E[Y])] \\[1em] &= \E[XY] - \E[X]\E[Y] \end{align*} $$
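The second line follows from the first by expanding the product and using linearity of expectation (note that $\E[X]$ and $\E[Y]$ are constants):

$$ \begin{align*} \E[(X - \E[X])(Y - \E[Y])] &= \E[XY - X\E[Y] - Y\E[X] + \E[X]\E[Y]] \\ &= \E[XY] - \E[X]\E[Y] - \E[X]\E[Y] + \E[X]\E[Y] \\ &= \E[XY] - \E[X]\E[Y] \end{align*} $$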

Covariance trends

As you can see from the first line of the definition, covariance is the expected product of the deviations of $X$ and $Y$ from their respective means.

If both variables tend to be above their means together, or below their means together, the product of deviations is positive, resulting in a positive covariance.

If one tends to be above its mean while the other is below, the product is negative, resulting in a negative covariance.
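As an illustration (not from the original text), here is a small NumPy sketch: two synthetic variables are constructed to move with and against a common signal, and the average product of deviations comes out positive and negative respectively. The variable names and noise levels are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# A common signal x; y_pos moves with it, y_neg moves against it.
x = rng.normal(size=1000)
y_pos = x + 0.5 * rng.normal(size=1000)   # same direction as x
y_neg = -x + 0.5 * rng.normal(size=1000)  # opposite direction to x

def cov(a, b):
    # Average product of deviations from the means (population form).
    return np.mean((a - a.mean()) * (b - b.mean()))

print(cov(x, y_pos))  # positive
print(cov(x, y_neg))  # negative
```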


Sample Covariance

Given $n$ paired observations $(x_1, y_1), \dots, (x_n, y_n)$, the sample covariance estimates $\Cov[X, Y]$ from data:

$$ \text{cov}_{x,y} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{n-1} $$

Dividing by $n-1$ rather than $n$ makes the estimator unbiased.
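A short sketch of the formula in NumPy, using made-up data points. `np.cov` uses the same $n-1$ denominator by default, so the hand-computed value should match its off-diagonal entry.

```python
import numpy as np

# Hypothetical paired observations for the demo.
x = np.array([2.1, 2.5, 4.0, 3.6])
y = np.array([8.0, 10.0, 12.0, 14.0])

n = len(x)
# Sum of products of deviations from the sample means, divided by n - 1.
s_xy = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)

# np.cov returns the 2x2 sample covariance matrix; [0, 1] is cov(x, y).
assert np.isclose(s_xy, np.cov(x, y)[0, 1])
print(s_xy)  # ≈ 2.0 for this data
```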


Independence and Covariance

When $X$ and $Y$ are independent, $\E[XY] = \E[X]\E[Y]$, so their covariance is zero:

$$ \Cov[X, Y] = \E[XY] - \E[X]\E[Y] = \E[X]\E[Y] - \E[X]\E[Y] = 0 $$
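This can be checked exactly for a small discrete example (my own illustration, not from the text): two independent fair dice. Enumerating the joint outcomes with exact rational arithmetic gives $\E[XY] = \E[X]\E[Y]$ precisely.

```python
from itertools import product
from fractions import Fraction

# Two independent fair dice: every pair (i, j) has probability 1/36.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

E_X  = sum(i * p for i, j in outcomes)      # 7/2
E_Y  = sum(j * p for i, j in outcomes)      # 7/2
E_XY = sum(i * j * p for i, j in outcomes)  # 49/4

cov = E_XY - E_X * E_Y
print(cov)  # 0
```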


Covariance Matrix

Let $X = (X_1, \dots, X_n)$ be a random vector.

Then the covariance matrix of $X$ is a square matrix $K_{XX}$ with entries:

$$ K_{XX}(i, j) = \Cov[X_i, X_j] = \E[(X_i - \E[X_i])(X_j - \E[X_j])] $$

Since $\Cov[X_i, X_i] = \Var[X_i]$, this is a symmetric matrix with the variances on the diagonal:

$$ K_{XX} = \begin{bmatrix} \Var[X_1] & \Cov[X_1, X_2] & \dots & \Cov[X_1, X_n] \\ \Cov[X_2, X_1] & \Var[X_2] & \dots & \Cov[X_2, X_n] \\ \vdots & \vdots & \ddots & \vdots \\ \Cov[X_n, X_1] & \Cov[X_n, X_2] & \dots & \Var[X_n] \end{bmatrix} $$
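A quick NumPy sketch of these properties, using a made-up data matrix: `np.cov` treats each row as one variable, and the resulting matrix is symmetric with the sample variances on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(1)
# Three random variables, each observed 500 times (one row per variable).
X = rng.normal(size=(3, 500))

K = np.cov(X)  # 3x3 sample covariance matrix (n - 1 denominator)

# Diagonal entries are the sample variances of each X_i.
assert np.allclose(np.diag(K), X.var(axis=1, ddof=1))
# Covariance matrices are symmetric: Cov[X_i, X_j] = Cov[X_j, X_i].
assert np.allclose(K, K.T)
print(K.shape)  # (3, 3)
```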