Calculate accuracy, precision, recall, F1 score, MCC, and other ML classification metrics from your confusion matrix.
Enter confusion matrix values and click Calculate.
Quick Reference:
Use precision when false positives are costly (e.g., spam filtering, recommendations). Use recall when false negatives are costly (e.g., disease detection, safety systems). The F1 score balances both.
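As a sketch of how these three metrics fall out of the confusion-matrix counts (the counts tp=40, fp=10, fn=5 are hypothetical, chosen only for illustration):

```python
# Precision, recall, and F1 from confusion-matrix counts.
# Guards against zero denominators, which occur when a class is never predicted.
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    precision = tp / (tp + fp) if (tp + fp) else 0.0   # of predicted positives, how many were right
    recall = tp / (tp + fn) if (tp + fn) else 0.0      # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

p, r, f1 = precision_recall_f1(tp=40, fp=10, fn=5)
print(f"precision={p:.3f} recall={r:.3f} f1={f1:.3f}")
# precision = 40/50 = 0.800, recall = 40/45 ≈ 0.889, f1 ≈ 0.842
```

F1 is the harmonic mean of precision and recall, so it is dragged toward whichever of the two is lower.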
Matthews Correlation Coefficient (MCC) ranges from -1 to +1, where +1 is perfect prediction, 0 is no better than random, and -1 is total disagreement. Use MCC when classes are imbalanced, since it is more informative than accuracy in that setting.
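A minimal sketch of the MCC formula, which uses all four cells of the matrix (the example counts are hypothetical):

```python
import math

# MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))
def mcc(tp: int, tn: int, fp: int, fn: int) -> float:
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # By convention, MCC is 0 when any marginal sum is zero (the score is undefined).
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(mcc(tp=50, tn=50, fp=0, fn=0))    # perfect classifier -> 1.0
print(mcc(tp=25, tn=25, fp=25, fn=25))  # coin-flip behavior -> 0.0
```

Because both classes appear in the numerator and all four marginals in the denominator, a model cannot score well on MCC by exploiting class imbalance alone.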
With a 99% negative class, a model that predicts "always negative" gets 99% accuracy but 0% recall for positives. Balanced accuracy and MCC give better insight.
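The pitfall above can be demonstrated numerically. Assuming a hypothetical dataset of 990 negatives and 10 positives, an always-negative model yields tp=0, fn=10, tn=990, fp=0:

```python
# Always-negative model on a 99%-negative dataset (hypothetical counts).
tp, fn, tn, fp = 0, 10, 990, 0

accuracy = (tp + tn) / (tp + tn + fp + fn)           # 0.99 -- looks great
recall = tp / (tp + fn) if (tp + fn) else 0.0        # 0.0  -- misses every positive
specificity = tn / (tn + fp) if (tn + fp) else 0.0   # 1.0  -- trivially, it never flags anyone
balanced_accuracy = (recall + specificity) / 2       # 0.5  -- no better than chance

print(f"accuracy={accuracy:.2f} recall={recall:.2f} balanced={balanced_accuracy:.2f}")
```

Balanced accuracy averages recall over both classes, so the inflated raw accuracy collapses to the chance level of 0.5.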
TP (top-left): correctly identified positives. TN (bottom-right): correctly identified negatives. FP (top-right): Type I error, a negative falsely flagged as positive. FN (bottom-left): Type II error, a missed positive.