Model Evaluation
Confusion Matrix Summary
True Positive: 6 blue points
True Negative: 5 red points
False Positive (Type 1 Error): 2 red points
False Negative (Type 2 Error): 1 blue point
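As a sketch of how these counts could be reproduced in code, the snippet below builds hypothetical label arrays that match the tallies above (treating blue as the positive class and red as the negative class) and reads off the four cells with scikit-learn's confusion_matrix. The arrays are assumptions standing in for the example's points, not the original data.

```python
from sklearn.metrics import confusion_matrix

# Hypothetical labels matching the example: 7 actual blue (positive), 7 actual red (negative)
y_true = [1] * 7 + [0] * 7
# Predictions arranged to give 6 TP, 1 FN, 2 FP, 5 TN
y_pred = [1] * 6 + [0] * 1 + [1] * 2 + [0] * 5

# For binary labels [0, 1], ravel() returns tn, fp, fn, tp
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tp, tn, fp, fn)  # 6 5 2 1
```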
Accuracy
accuracy = (True Positives + True Negatives) / Total Points
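Plugging the example counts into this formula gives (6 + 5) / 14 ≈ 0.786; a minimal worked sketch:

```python
# Example counts from the confusion matrix summary above
tp, tn, fp, fn = 6, 5, 2, 1

# Accuracy = correct predictions / all predictions
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(round(accuracy, 3))  # 0.786
```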
Accuracy is not always a reliable metric for model evaluation, especially on imbalanced data sets.
For example, if 99% of the data is positive and only 1% is negative, a model that simply always predicts positive achieves 99% accuracy while catching none of the negative data.
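A minimal sketch of that scenario, using made-up labels (99 positives, 1 negative) and a trivial always-positive predictor:

```python
# Made-up imbalanced data: 99 positives, 1 negative
y_true = [1] * 99 + [0] * 1
# Trivial model that always predicts positive
y_pred = [1] * 100

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
negatives_caught = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

print(accuracy)          # 0.99
print(negatives_caught)  # 0 -- the negative class is never detected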
Precision
precision = True Positives / (True Positives + False Positives)
Precision focuses on False Positive errors: of all the points the model predicted as positive, how many are actually positive.
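On the example counts above this gives 6 / (6 + 2) = 0.75:

```python
# Example counts: 6 true positives, 2 false positives
tp, fp = 6, 2
precision = tp / (tp + fp)
print(precision)  # 0.75
```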
Recall
recall = True Positives / (True Positives + False Negatives)
Recall focuses on False Negative errors: of all the actually positive points, how many the model correctly identified.
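On the example counts above this gives 6 / (6 + 1) ≈ 0.857:

```python
# Example counts: 6 true positives, 1 false negative
tp, fn = 6, 1
recall = tp / (tp + fn)
print(round(recall, 3))  # 0.857
```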
F1 Score
F1 Score = 2 * Precision * Recall / (Precision + Recall)
F1 Score is the harmonic mean of Precision and Recall: it is high only when both are high, and it is pulled toward the smaller of the two.
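Combining the example's precision (0.75) and recall (6/7) gives an F1 score of 0.8:

```python
# Precision and recall computed from the example counts above
precision, recall = 0.75, 6 / 7

# Harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # 0.8
```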