Properties of Expectation, Variance, and Covariance

This is a quick summary of the properties of expectation, variance, and covariance.

Table of contents
  1. Expectation
  2. Conditional Expectation
  3. Variance
  4. Conditional Variance
  5. Covariance

Expectation

$$ \begin{equation} \tag{Expected value of a constant} \E[c] = c \end{equation} $$

For the same reason, since $\E[X]$ is itself a constant, \[\E[\E[X]] = \E[X]\]

$$ \begin{equation} \tag{Linearity of expectation} \E[aX + bY] = a\E[X] + b\E[Y] \end{equation} $$ $$ \begin{equation} \tag{Expectation of a product} \E[XY] = \E[X]\E[Y] + \Cov(X, Y) \end{equation} $$

Independence

When $X$ and $Y$ are independent,

\[\begin{gather*} \Cov(X, Y) = 0 \\[1em] \E[XY] = \E[X]\E[Y] \end{gather*}\]
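
These identities are easy to check by simulation. Below is a minimal sketch assuming NumPy; the distributions, coefficients, and seed are arbitrary illustrative choices.

```python
# Simulation-based sanity check of the expectation identities above.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a, b = 2.0, -3.0

# A correlated pair: Y depends on X, so Cov(X, Y) != 0.
X = rng.normal(1.0, 1.0, n)
Y = 0.5 * X + rng.normal(0.0, 1.0, n)

# Linearity of expectation: E[aX + bY]  vs  a E[X] + b E[Y].
print(np.mean(a * X + b * Y), a * X.mean() + b * Y.mean())

# Expectation of a product: E[XY]  vs  E[X]E[Y] + Cov(X, Y).
cov_xy = np.cov(X, Y)[0, 1]
print(np.mean(X * Y), X.mean() * Y.mean() + cov_xy)

# Independence: Z is drawn independently of X, so E[XZ] ~ E[X]E[Z].
Z = rng.normal(2.0, 1.0, n)
print(np.mean(X * Z), X.mean() * Z.mean())
```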

Conditional Expectation

$$ \begin{equation} \tag{Law of total expectation} \E[X] = \E[\E[X|Y]] \end{equation} $$

Conditional Expectation as RV

$\E[X|Y]$ is itself a random variable: it is a function of $Y$.

Hence the outer expectation in $\E[\E[X|Y]]$ averages over all possible values of $Y$, which is why we are left with $\E[X]$.
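
A small simulation makes the averaging concrete (a sketch assuming NumPy; the discrete $Y$ and the dependence of $X$ on $Y$ are just illustrative): estimate $\E[X|Y=y]$ by grouping on $y$, then average with the weights $P(Y=y)$.

```python
# Law of total expectation: E[X] = E[E[X|Y]], illustrated with a discrete Y.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

Y = rng.integers(0, 3, n)           # Y takes values 0, 1, 2
X = Y + rng.normal(0.0, 1.0, n)     # X depends on Y

# E[X|Y=y] estimated within each group, then averaged with weights P(Y=y).
e_x_given_y = np.array([X[Y == y].mean() for y in range(3)])
p_y = np.array([(Y == y).mean() for y in range(3)])

print(X.mean(), (p_y * e_x_given_y).sum())   # both close to 1.0
```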

$$ \begin{gather*} \E[a | X] = a \\[0.5em] \E[aX + bY | Z] = a\E[X|Z] + b\E[Y|Z] \end{gather*} $$

The first says a constant is unaffected by conditioning; the second is linearity of conditional expectation.

$$ \begin{gather*} \E[X | X] = X \\[0.5em] \E[g(X) | X] = g(X) \end{gather*} $$

The above are immediate: once $X$ is known, $X$ (and hence any function $g(X)$ of it) is no longer random.

$$ \E[X | Y, g(Y)] = \E[X | Y] $$

The above is also intuitive: if you already know $Y$, then $g(Y)$ carries no additional information.

$$ \begin{equation*} \E[X g(Y) | Y] = g(Y)\E[X|Y] \end{equation*} $$ $$ \begin{equation*} \E[\E[X|Y, Z]| Y] = \E[X|Y] \end{equation*} $$

These need a bit more thought. Conditioned on $Y$, the factor $g(Y)$ is known and behaves like a constant, so it can be pulled out of the expectation. The second identity is the tower property: conditioning on the extra information $Z$ and then averaging $Z$ out (given $Y$) leaves $\E[X|Y]$.
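
The same grouping trick gives a quick numerical check (a sketch assuming NumPy, with discrete $Y$ and $Z$ chosen only for illustration):

```python
# Check E[X g(Y) | Y] = g(Y) E[X | Y] and E[E[X|Y,Z] | Y] = E[X|Y], group by group.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

Y = rng.integers(0, 3, n)
Z = rng.integers(0, 2, n)
X = Y + 2 * Z + rng.normal(0.0, 1.0, n)

def g(y):                                    # an arbitrary function of Y
    return y ** 2 + 1

for y in range(3):
    mask = Y == y

    # Pulling g(Y) out: E[X g(Y) | Y = y]  vs  g(y) E[X | Y = y]
    print((X[mask] * g(Y[mask])).mean(), g(y) * X[mask].mean())

    # Tower property: E[E[X | Y, Z] | Y = y]  vs  E[X | Y = y]
    e_x_given_yz = np.array([X[mask & (Z == z)].mean() for z in range(2)])
    p_z_given_y = np.array([(mask & (Z == z)).sum() / mask.sum() for z in range(2)])
    print((p_z_given_y * e_x_given_yz).sum(), X[mask].mean())
```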


Variance

$$ \begin{align*} \Var(X) &= \E[(X - \E[X])^2] \\[0.5em] &= \E[X^2] - \E[X]^2 \end{align*} $$

Derivation

Just remember that $\E[\E[X]] = \E[X]$ and $\E[\E[X]^2] = \E[X]^2$, because $\E[X]$ is a constant.

\[\begin{align*} \Var(X) &= \E[(X - \E[X])^2] \\[0.5em] &= \E[X^2 - 2X\E[X] + \E[X]^2] \\[0.5em] &= \E[X^2] - 2\E[X]\E[X] + \E[X]^2 \\[0.5em] &= \E[X^2] - \E[X]^2 \end{align*}\]

$$ \begin{gather*} \Var(aX + b) = a^2\Var(X) \\[0.5em] \Var(aX + bY) = a^2\Var(X) + b^2\Var(Y) + 2ab\Cov(X, Y) \\[0.5em] \Var(aX - bY) = a^2\Var(X) + b^2\Var(Y) - 2ab\Cov(X, Y) \end{gather*} $$
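
These can also be checked numerically (a sketch assuming NumPy; the shortcut $\Var(X) = \E[X^2] - \E[X]^2$ is checked first):

```python
# Numerical check of the variance identities above.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
a, b = 2.0, 0.5

X = rng.normal(0.0, 1.0, n)
Y = 0.3 * X + rng.normal(0.0, 2.0, n)    # correlated with X
cov_xy = np.cov(X, Y)[0, 1]

# Shortcut formula: Var(X) = E[X^2] - E[X]^2.
print(np.var(X), np.mean(X ** 2) - np.mean(X) ** 2)

# Var(aX + b) = a^2 Var(X): shifting by a constant does not change the spread.
print(np.var(a * X + 7.0), a ** 2 * np.var(X))

# Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
print(np.var(a * X + b * Y),
      a ** 2 * np.var(X) + b ** 2 * np.var(Y) + 2 * a * b * cov_xy)

# Var(aX - bY) = a^2 Var(X) + b^2 Var(Y) - 2ab Cov(X, Y)
print(np.var(a * X - b * Y),
      a ** 2 * np.var(X) + b ** 2 * np.var(Y) - 2 * a * b * cov_xy)
```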


Conditional Variance

$$ \begin{align*} \Var(X|Y) &= \E[(X - \E[X|Y])^2|Y] \\[0.5em] &= \E[X^2|Y] - \E[X|Y]^2 \end{align*} $$

Derivation \[\begin{align*} \Var(X|Y) &= \E[(X - \E[X|Y])^2|Y] \\[0.5em] &= \E[X^2 - 2X\E[X|Y] + \E[X|Y]^2|Y] \\[0.5em] &= \E[X^2|Y] - 2\E[X\E[X|Y]|Y] + \E[\E[X|Y]^2|Y] \tag{$\ast$} \\[0.5em] &= \E[X^2|Y] - 2\E[X|Y]\E[X|Y] + \E[X|Y]^2 \\[0.5em] &= \E[X^2|Y] - \E[X|Y]^2 \end{align*}\]

In $(\ast)$, remember that $\E[X|Y]$ is a function of $Y$. See above:

\[\begin{gather*} \E[Xg(Y)|Y] = g(Y)\E[X|Y] \\[0.5em] \E[g(Y)|Y] = g(Y) \end{gather*}\]
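
As in the unconditional case, the shortcut can be checked group by group (a sketch assuming NumPy and a discrete $Y$):

```python
# Conditional-variance shortcut: Var(X|Y=y) = E[X^2|Y=y] - E[X|Y=y]^2.
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

Y = rng.integers(0, 3, n)
X = Y + rng.normal(0.0, 1.0 + Y, n)   # the spread of X depends on Y

for y in range(3):
    x = X[Y == y]
    print(x.var(), np.mean(x ** 2) - x.mean() ** 2)
```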

$$ \begin{equation} \tag{Law of total variance} \Var(X) = \E[\Var(X|Y)] + \Var(\E[X|Y]) \end{equation} $$

Derivation \[\begin{align*} \E[\Var(X|Y)] &= \E[\E[X^2|Y] - \E[X|Y]^2] \\[0.5em] &= \E[\E[X^2|Y]] - \E[\E[X|Y]^2] \\[0.5em] &= \E[X^2] - \E[\E[X|Y]^2] \end{align*}\]

and,

\[\begin{align*} \Var(\E[X|Y]) &= \E[\E[X|Y]^2] - \E[\E[X|Y]]^2 \\[0.5em] &= \E[\E[X|Y]^2] - \E[X]^2 \end{align*}\]

Then,

\[\E[\Var(X|Y)] + \Var(\E[X|Y]) = \E[X^2] - \E[X]^2 = \Var(X)\]
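
Putting the pieces together, the law of total variance can be illustrated with the same discrete grouping (a sketch assuming NumPy; the choice of $X$ and $Y$ is arbitrary, with both the mean and spread of $X$ depending on $Y$):

```python
# Law of total variance: Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

Y = rng.integers(0, 3, n)
X = 2 * Y + rng.normal(0.0, 1.0 + Y, n)

p_y       = np.array([(Y == y).mean() for y in range(3)])
e_given_y = np.array([X[Y == y].mean() for y in range(3)])
v_given_y = np.array([X[Y == y].var() for y in range(3)])

e_of_cond_var = (p_y * v_given_y).sum()                                       # E[Var(X|Y)]
var_of_cond_e = (p_y * e_given_y ** 2).sum() - (p_y * e_given_y).sum() ** 2   # Var(E[X|Y])

print(X.var(), e_of_cond_var + var_of_cond_e)
```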

Covariance

$$ \begin{align*} \Cov(X, Y) &= \E[(X - \E[X])(Y - \E[Y])] \\[0.5em] &= \E[XY] - \E[X]\E[Y] \end{align*} $$
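
A last quick check (assuming NumPy) that the two expressions agree:

```python
# Covariance shortcut: Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y].
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000

X = rng.normal(1.0, 1.0, n)
Y = 2.0 * X + rng.normal(0.0, 1.0, n)

print(np.mean((X - X.mean()) * (Y - Y.mean())),
      np.mean(X * Y) - X.mean() * Y.mean())
```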