Sum of Squares
What is Sum of Squares?
Sum of squares is a concept frequently used in regression analysis.
Depending on which deviations we choose to square, we end up with several different sums of squares.
Total Sum of Squares
The total sum of squares ($TSS$) measures the total variation of the observed values around their mean:

$$TSS = \sum_{i=1}^{n} (y_i - \bar{y})^2$$

where $\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$ is the sample mean of the observed values.
Graphically, in a simple linear regression with one independent variable, $TSS$ corresponds to the squared vertical distances between each observed point and the horizontal line at the mean $\bar{y}$.
Residual Sum of Squares
Also known as sum of squared errors (SSE).
The residual sum of squares ($RSS$) measures the variation that the fitted model leaves unexplained:

$$RSS = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$

where $\hat{y}_i$ is the fitted value for observation $i$.
Graphically, in a simple linear regression with one independent variable, $RSS$ corresponds to the squared vertical distances between each observed point and the fitted regression line.
Explained Sum of Squares
Also known as model sum of squares.
The explained sum of squares ($ESS$) measures the variation that the fitted model accounts for:

$$ESS = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2$$
Graphically, in a simple linear regression with one independent variable, $ESS$ corresponds to the squared vertical distances between each fitted value and the horizontal line at the mean $\bar{y}$.
Relationship Between Sums of Squares
For linear regression models using Ordinary Least Squares (OLS) estimation, the following relationship holds:

$$TSS = ESS + RSS$$
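As a quick numerical check, here is a minimal NumPy sketch (the data values are made up for illustration) that fits a simple OLS line and verifies the decomposition:

```python
import numpy as np

# Toy data, hypothetical values for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Simple OLS fit: y = b0 + b1 * x
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares

print(tss, ess + rss)  # the two values agree up to floating-point error
```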
Ordinary Least Squares
Least squares is a common estimation method for linear regression models.
The idea is to fit the model that minimizes some sum of squares (i.e. achieves the least squares).
Ordinary least squares (OLS) minimizes the residual sum of squares.
Since we are minimizing the sum of squares with respect to the parameters, we set the partial derivatives to zero and solve. In simple linear regression, for example:

$$\frac{\partial RSS}{\partial \hat{\beta}_0} = -2 \sum_{i=1}^{n} \left(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\right) = 0$$

$$\frac{\partial RSS}{\partial \hat{\beta}_1} = -2 \sum_{i=1}^{n} x_i \left(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\right) = 0$$
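Solving these two equations (the normal equations) yields the familiar simple-regression estimates; this step is standard and included here for completeness:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$$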
Given that some conditions hold, there is a closed-form estimate for $\hat{\beta}$. In matrix form:

$$\hat{\beta} = (X^\top X)^{-1} X^\top y$$
If $X^\top X$ is not invertible (for example, when the columns of $X$ are perfectly collinear), this closed form is unavailable and the minimizer is not unique.
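Here is a minimal NumPy sketch of the matrix closed form (the design matrix, true coefficients, and noise level below are made-up values; in practice np.linalg.lstsq is the numerically safer route):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])  # design matrix with an intercept column
beta_true = np.array([1.0, 2.0])      # hypothetical true coefficients
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Closed-form OLS: solve (X'X) beta = X'y rather than inverting X'X directly
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to [1.0, 2.0]
```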
Properties of OLS
Consistent
OLS estimators are consistent:

$$\hat{\beta} \xrightarrow{p} \beta \quad \text{as } n \to \infty$$
Asymptotically Normal
OLS estimators are asymptotically normal:

$$\sqrt{n}\left(\hat{\beta} - \beta\right) \xrightarrow{d} \mathcal{N}\left(0,\ \sigma^2 \, \mathbb{E}\left[x_i x_i^\top\right]^{-1}\right)$$
Hence you can construct normal-approximation confidence intervals for $\beta$.
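As an illustration, here is a minimal simulation sketch of a normal-approximation 95% confidence interval, assuming homoskedastic errors so that $\widehat{\mathrm{Var}}(\hat{\beta}) = \hat{\sigma}^2 (X^\top X)^{-1}$ (all data below is simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)  # hypothetical true beta = [1, 2]

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ beta_hat
sigma2_hat = residuals @ residuals / (n - 2)   # unbiased estimate of the error variance
cov_hat = sigma2_hat * np.linalg.inv(X.T @ X)  # estimated covariance of beta_hat
se = np.sqrt(np.diag(cov_hat))

# Normal-approximation 95% intervals: beta_hat +/- 1.96 * standard error
for b, s in zip(beta_hat, se):
    print(f"[{b - 1.96 * s:.3f}, {b + 1.96 * s:.3f}]")
```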