Performance Evaluation Metrics
Theory
Confusion Matrix
                         Actual value
                         Positive           Negative
Predicted    Positive    True Positive      False Positive
value        Negative    False Negative     True Negative
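The four cells of the confusion matrix can be counted directly from the labels. A minimal pure-Python sketch (scikit-learn's sklearn.metrics.confusion_matrix does the same; the labels below are illustrative, with 1 = positive and 0 = negative):

```python
def confusion_matrix_counts(y_true, y_pred):
    # Count each cell of the 2x2 confusion matrix for binary labels.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]
print(confusion_matrix_counts(y_true, y_pred))  # (2, 1, 1, 2)
```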
Accuracy

Accuracy is the fraction of all predictions that the model got right:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
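Accuracy, (TP + TN) / (TP + TN + FP + FN), can be computed from the confusion-matrix counts; the counts below are illustrative:

```python
def accuracy(tp, fp, fn, tn):
    # Fraction of all predictions that were correct.
    return (tp + tn) / (tp + tn + fp + fn)

print(accuracy(tp=2, fp=1, fn=1, tn=2))  # 4 correct out of 6 -> 0.666...
```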
Precision

Precision is the fraction of predicted positives that are actually positive:

Precision = TP / (TP + FP)

Precision is used when the cost of a false positive is higher than the cost of a false negative.
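Precision, TP / (TP + FP), as a short sketch (the counts are illustrative; sklearn.metrics.precision_score provides the same from raw labels):

```python
def precision(tp, fp):
    # Of everything predicted positive, how much actually was positive.
    return tp / (tp + fp)

print(precision(tp=8, fp=2))  # 8 of 10 predicted positives are correct -> 0.8
```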
Recall

Recall is the fraction of actual positives that the model correctly identifies:

Recall = TP / (TP + FN)

Recall is used when the cost of a false negative is higher than the cost of a false positive.
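Recall, TP / (TP + FN), as a matching sketch (counts again illustrative; sklearn.metrics.recall_score computes this from labels):

```python
def recall(tp, fn):
    # Of all actual positives, how many the model found.
    return tp / (tp + fn)

print(recall(tp=8, fn=2))  # found 8 of 10 actual positives -> 0.8
```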
F1-Score

The F1-Score is the harmonic mean of precision and recall, combining both into a single number:

F1 = 2 * (Precision * Recall) / (Precision + Recall)

Because the harmonic mean is dominated by the smaller value, a model must do well on both precision and recall to achieve a high F1-Score.
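The harmonic-mean behaviour is easy to see in code: with an imbalanced precision/recall pair, F1 sits closer to the lower of the two (values below are illustrative):

```python
def f1_score(precision, recall):
    # Harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.8, 0.5))  # ~0.615, pulled toward the lower value (0.5)
```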
References
https://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html
https://www.kdnuggets.com/2020/04/performance-evaluation-metrics-classification.html