A table used to describe the performance of a classification model (or "classifier") on a set of test data for which the true values are known.
A confusion matrix has four outcome classes:
- True Positives - the true value and the predicted value are both high [1 == 1]
- False Positives - the true value is low but the predicted value is high [0 != 1]
- True Negatives - the true value and the predicted value are both low [0 == 0]
- False Negatives - the true value is high but the predicted value is low [1 != 0]
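The four classes above can be sketched as a small helper that labels a single (true, predicted) pair; the function name `outcome` is an illustrative choice, not from the original:

```python
def outcome(true_value, predicted_value):
    """Label a (true, predicted) pair with its confusion-matrix class."""
    if true_value == 1 and predicted_value == 1:
        return "TP"  # true positive: both high
    if true_value == 0 and predicted_value == 1:
        return "FP"  # false positive: predicted high, actually low
    if true_value == 1 and predicted_value == 0:
        return "FN"  # false negative: predicted low, actually high
    return "TN"      # true negative: both low

print(outcome(1, 1))  # TP
print(outcome(0, 1))  # FP
```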
True Values = [1, 1, 0, 1, 1, 1, 0, 0, 1, 0]
Predicted Values = [1, 0, 0, 1, 0, 1, 1, 1, 1, 0]
| True\Pred | 0 | 1 |
|---|---|---|
| 0 | 2 | 2 |
| 1 | 2 | 4 |
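Counting the four cells from the two lists can be sketched in plain Python, assuming the convention above (rows are true values, columns are predicted values):

```python
from collections import Counter

true_values      = [1, 1, 0, 1, 1, 1, 0, 0, 1, 0]
predicted_values = [1, 0, 0, 1, 0, 1, 1, 1, 1, 0]

# Tally each (true, predicted) pair into its confusion-matrix cell.
counts = Counter(zip(true_values, predicted_values))
tn = counts[(0, 0)]  # true 0, predicted 0
fp = counts[(0, 1)]  # true 0, predicted 1
fn = counts[(1, 0)]  # true 1, predicted 0
tp = counts[(1, 1)]  # true 1, predicted 1

print(f"TN={tn} FP={fp} FN={fn} TP={tp}")  # TN=2 FP=2 FN=2 TP=4
```

The same counts can be obtained with `sklearn.metrics.confusion_matrix(true_values, predicted_values)` if scikit-learn is available.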
From the above matrix we can derive the following statistics:
Accuracy of the Model - how often the model is right:
Formula: (TP+TN)/total = (4+2) / 10 = 60%
Misclassification Rate (Error Rate) - how often the model is wrong:
Formula: (FP+FN)/total = (2+2) / 10 = 40%
True Positive Rate (Recall/Sensitivity) - how often the model predicts high when the true value is high:
Formula: TP/total(True(1)) = 4 / 6 = 66.6%
True Negative Rate (Specificity) - how often the model predicts low when the true value is low:
Formula: TN/total(True(0)) = 2 / 4 = 50%
Precision - how often a high prediction is actually correct:
Formula: TP/total(Predicted(1)) = 4 / 6 = 66.6%
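The statistics above can be computed directly from the four cell counts; this is a minimal sketch using the counts derived from the example matrix:

```python
tp, tn, fp, fn = 4, 2, 2, 2   # cell counts from the example matrix
total = tp + tn + fp + fn

accuracy  = (tp + tn) / total   # (4+2)/10 = 0.6
error     = (fp + fn) / total   # (2+2)/10 = 0.4
tpr       = tp / (tp + fn)      # recall/sensitivity: 4/6
tnr       = tn / (tn + fp)      # specificity: 2/4 = 0.5
precision = tp / (tp + fp)      # 4/6

print(f"accuracy={accuracy:.0%} error={error:.0%}")  # accuracy=60% error=40%
```

Note that error is simply 1 - accuracy, since every prediction is either right (TP, TN) or wrong (FP, FN).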