Confusion Matrix
What is a confusion matrix?
| Actual \ Predicted | Positive | Negative |
|---|---|---|
| Positive | True Positive | False Negative |
| Negative | False Positive | True Negative |
It is a performance measure for machine learning classification, also known as the error matrix.
As the table shows, each predicted label is compared against a known actual (condition) label, so it is typically used to evaluate supervised learning models.
Terminology
Condition Positive (P)
Actually positive
Condition Negative (N)
Actually negative
True Positive (TP)
Actually positive & Tested positive
True Negative (TN)
Actually negative & Tested negative
False Positive (FP)
Actually negative & Tested positive
False Negative (FN)
Actually positive & Tested negative
For TP, TN, FP, and FN, the trailing Positive/Negative refers to the test (predicted) result, while True/False states whether that result matches the actual condition.
A handy mnemonic: read FP as "falsely positive, i.e. actually not positive" and FN as "falsely negative, i.e. actually not negative".
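The four counts above can be tallied directly from paired actual/predicted labels. A minimal sketch in plain Python (the function name and boolean label encoding are illustrative, not from the text):

```python
# Tally TP, TN, FP, FN from actual vs. predicted boolean labels.
def confusion_counts(actual, predicted):
    tp = sum(1 for a, p in zip(actual, predicted) if a and p)          # actually positive, tested positive
    tn = sum(1 for a, p in zip(actual, predicted) if not a and not p)  # actually negative, tested negative
    fp = sum(1 for a, p in zip(actual, predicted) if not a and p)      # actually negative, tested positive
    fn = sum(1 for a, p in zip(actual, predicted) if a and not p)      # actually positive, tested negative
    return tp, tn, fp, fn

actual    = [True, True, False, False, True]
predicted = [True, False, True, False, True]
print(confusion_counts(actual, predicted))  # -> (2, 1, 1, 1)
```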
Types of Errors
True Positive (TP) and True Negative (TN) are correct outcomes; the two kinds of misclassification are named as follows.
Type I Error
False Positive (FP) is a Type I error.
Type II Error
False Negative (FN) is a Type II error.
Confusion metrics
Some of the basic metrics derived from the confusion matrix:
True Positive Rate (TPR)
Also called: sensitivity, recall, hit rate
\[TPR = \frac{TP}{P} = \frac{TP}{TP+FN} = 1 - FNR\]

True Negative Rate (TNR)
Also called: specificity, selectivity
\[TNR = \frac{TN}{N} = \frac{TN}{TN+FP} = 1 - FPR\]

Positive Predictive Value (PPV)
Also called: precision
\[PPV = \frac{TP}{TP+FP}\]

Accuracy (ACC)
\[ACC = \frac{TP+TN}{P+N} = \frac{TP+TN}{TP+FN+TN+FP}\]

False Negative Rate (FNR)
Also called: miss rate
\[FNR = \frac{FN}{P} = \frac{FN}{FN+TP} = 1 - TPR\]

False Positive Rate (FPR)
Also called: fall-out
\[FPR = \frac{FP}{N} = \frac{FP}{FP+TN} = 1 - TNR\]
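The formulas above translate directly into code once the four counts are known. A minimal sketch (function name and sample counts are illustrative):

```python
# Compute the basic confusion-matrix metrics from raw counts.
def metrics(tp, tn, fp, fn):
    p = tp + fn  # condition positives
    n = tn + fp  # condition negatives
    return {
        "TPR": tp / p,              # sensitivity / recall
        "TNR": tn / n,              # specificity
        "PPV": tp / (tp + fp),      # precision
        "ACC": (tp + tn) / (p + n),
        "FNR": fn / p,              # miss rate = 1 - TPR
        "FPR": fp / n,              # fall-out = 1 - TNR
    }

m = metrics(tp=40, tn=50, fp=10, fn=0)
print(m["TPR"], m["ACC"])  # -> 1.0 0.9
```

Note that the complementary pairs hold by construction: `m["TPR"] + m["FNR"] == 1` and `m["TNR"] + m["FPR"] == 1`.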