
**Precision** tells us how many, out of all instances that were predicted to belong to class $X$, actually belonged to class $X$. The precision for class $X$ is calculated as:

$\frac{TP}{TP+FP}$

$TP$ = the number of true positives for class $X$.

$FP$ = the number of false positives for class $X$.

**Recall** tells us how many, out of all instances that actually belonged to class $X$, were predicted correctly. The recall for class $X$ is calculated as:

$\frac{TP}{TP+FN}$

$TP$ = the number of true positives for class $X$.

$FN$ = the number of false negatives for class $X$.
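The two formulas translate directly into code. A minimal sketch (the function names are illustrative, not from the text):

```python
def precision(tp: int, fp: int) -> float:
    """Fraction of instances predicted as class X that actually belong to X."""
    return tp / (tp + fp)


def recall(tp: int, fn: int) -> float:
    """Fraction of actual class-X instances that were predicted as X."""
    return tp / (tp + fn)
```

Note that precision and recall share the numerator $TP$ but differ in the denominator: precision divides by everything *predicted* positive, recall by everything *actually* positive.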

Suppose that we have the following confusion matrix:

|                     | Predicted positive | Predicted negative |
| ------------------- | ------------------ | ------------------ |
| **Actual positive** | 13 (TP)            | 3 (FN)             |
| **Actual negative** | 4 (FP)             | TN                 |

Binary classification problems often focus on the positive class; therefore, precision and recall are calculated for the positive class.

**Precision**

The precision is equal to:

$\frac{13}{13+4} = 0.7647$

This shows that $76.47\%$ of the records that were classified as positive were *actually* positive.

**Recall**

The recall is equal to:

$\frac{13}{13+3} = 0.8125$

This shows that $81.25\%$ of the positive instances were classified as positive.
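To check the arithmetic, one can rebuild per-instance labels that reproduce the matrix counts. The true-negative count is an arbitrary assumption here (10), since neither metric uses it:

```python
# Labels reconstructed from the confusion matrix: 13 TP, 4 FP, 3 FN,
# plus an assumed 10 TN (TN affects neither precision nor recall).
y_true = [1] * 13 + [0] * 4 + [1] * 3 + [0] * 10
y_pred = [1] * 13 + [1] * 4 + [0] * 3 + [0] * 10

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

precision = tp / (tp + fp)  # 13 / 17 ≈ 0.7647
recall = tp / (tp + fn)     # 13 / 16 = 0.8125
```

Changing the number of true negatives leaves both values untouched, which is exactly why these metrics are preferred over accuracy when the negative class dominates.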
