"It is the mark of a truly intelligent person to be moved by statistics." - George Bernard Shaw, Irish playwright and political activist
Every term of the confusion matrix
Let's work through a confusion matrix for a well-known problem: loan defaulters. Using this confusion matrix, we will calculate every metric that can be derived from it.
| Confusion matrix | Predicted: Defaulter | Predicted: Non-defaulter |
| --- | --- | --- |
| Actual: Defaulter | 20 (TP) | 7 (FN) |
| Actual: Non-defaulter | 22 (FP) | 121 (TN) |
Non-defaulter means the customer is paying the loan instalments on time.
Defaulter means the customer is not paying the bank on time and has outstanding dues.
True Positive (TP): We predicted 20 cases would be defaulters, and this turned out to be true.
True Negative (TN): We predicted 121 cases would be non-defaulters, and this turned out to be true.
False Positive (FP): We predicted 22 cases would be defaulters, but this turned out to be false.
False Negative (FN): We predicted 7 cases would be non-defaulters, but this turned out to be false.
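In practice these four counts usually come out of a confusion matrix computed from model predictions rather than being counted by hand. The snippet below is a minimal sketch, not taken from the original post; the y_true and y_pred lists are made-up toy labels, with 1 standing for defaulter and 0 for non-defaulter.

```python
# A minimal sketch of extracting TP/TN/FP/FN with scikit-learn.
# The labels below are toy values for illustration only (1 = defaulter, 0 = non-defaulter).
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 0, 1, 0, 1, 0, 0]   # hypothetical ground-truth labels
y_pred = [1, 0, 1, 0, 0, 1, 0, 0]   # hypothetical model predictions

# For binary labels, ravel() returns the counts in the order TN, FP, FN, TP.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp}, TN={tn}, FP={fp}, FN={fn}")  # TP=2, TN=4, FP=1, FN=1
```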
Accuracy: How often is the classifier correct overall?
(TP+TN)/(TP+TN+FP+FN) = (20+121)/170 = 82.94%
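As a quick sanity check, the same number can be reproduced in a few lines of Python; the four counts are taken directly from the matrix above.

```python
# Counts taken from the confusion matrix above
TP, TN, FP, FN = 20, 121, 22, 7

accuracy = (TP + TN) / (TP + TN + FP + FN)
print(f"Accuracy: {accuracy:.2%}")  # Accuracy: 82.94%
```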
Sensitivity/Recall/True Positive Rate: When a customer is actually a defaulter, how often does the model predict defaulter?
TP/(TP+FN) = 20/27 = 74.07%
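The same calculation as a short Python sketch, again using the counts from the matrix above:

```python
TP, FN = 20, 7  # counts from the confusion matrix above

recall = TP / (TP + FN)
print(f"Recall: {recall:.2%}")  # Recall: 74.07%
```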
Precision: Precision measures the accuracy of the positive class, i.e. how likely a positive (defaulter) prediction is to be correct.
TP/(TP+FP) = 20/42 ≈ 47.61%
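In the same sketch style (note that 20/42 is 47.619..., so Python's rounding prints 47.62% where the figure above truncates to 47.61%):

```python
TP, FP = 20, 22  # counts from the confusion matrix above

precision = TP / (TP + FP)
print(f"Precision: {precision:.2%}")  # Precision: 47.62% (truncated to 47.61% above)
```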
Specificity/True Negative Rate: When a customer is actually a non-defaulter, how often does the model predict non-defaulter?
TN/(TN+FP) = 121/143 ≈ 84.61%
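And the corresponding sketch for specificity (121/143 rounds to 84.62%, quoted above as 84.61%):

```python
TN, FP = 121, 22  # counts from the confusion matrix above

specificity = TN / (TN + FP)
print(f"Specificity: {specificity:.2%}")  # Specificity: 84.62% (truncated to 84.61% above)
```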
F-measure: The F-measure combines precision and recall into a single score. It is their harmonic mean, so its value always lies closer to the smaller of the two.
(2*Recall*Precision)/(Recall+Precision) ≈ 57.96%
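Finally, the F-measure computed from the same counts. Because the harmonic mean leans toward the smaller input, the result sits much closer to the ~47.6% precision than to the ~74.1% recall:

```python
TP, FP, FN = 20, 22, 7  # counts from the confusion matrix above

precision = TP / (TP + FP)
recall = TP / (TP + FN)
f1 = 2 * precision * recall / (precision + recall)
print(f"F-measure: {f1:.2%}")  # F-measure: 57.97% (truncated to 57.96% above)
```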
Feel free to share your opinion in the comments 💓