## Precision and Recall with the Image Recognition Example

Precision and recall are evaluation metrics used to assess the performance of a machine learning classifier.

This post assumes some prior understanding of the confusion matrix; if you need a refresher, I would recommend going through it here.

Example of Animal Image Recognition

Consider the confusion matrix below for a set of animal images, where the algorithm tries to identify each animal correctly (rows are the actual animal, columns are the predicted animal):

| Actual \ Predicted | Cat | Dog | Leopard | Tiger | Jaguar | Puma |
|--------------------|-----|-----|---------|-------|--------|------|
| Cat                | 62  | 2   | 0       | 0     | 1      | 0    |
| Dog                | 1   | 50  | 1       | 0     | 4      | 0    |
| Leopard            | 0   | 2   | 98      | 4     | 0      | 0    |
| Tiger              | 0   | 0   | 10      | 78    | 2      | 0    |
| Jaguar             | 0   | 1   | 8       | 0     | 46     | 0    |
| Puma               | 2   | 0   | 0       | 1     | 1      | 42   |

Explaining a few of the cells:

• [Cat, Cat]: This cell has the value 62, meaning an image of a cat was identified as a cat 62 times.
• [Cat, Dog]: This cell has the value 2, meaning an image of a cat was identified as a dog twice.
• [Leopard, Tiger]: This cell has the value 4, meaning an image of a leopard was identified as a tiger 4 times.

Q. How many times did our algorithm predict the image to be a Tiger?

A. Looking at the Tiger column: 0+0+4+78+0+1 = 83
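This column sum is easy to sketch in code. The following is a minimal example; the variable names (`labels`, `cm`) are my own, not from the post:

```python
# Confusion matrix from the post: rows = actual animal, columns = predicted animal.
labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = [
    [62,  2,  0,  0,  1,  0],  # actual Cat
    [ 1, 50,  1,  0,  4,  0],  # actual Dog
    [ 0,  2, 98,  4,  0,  0],  # actual Leopard
    [ 0,  0, 10, 78,  2,  0],  # actual Tiger
    [ 0,  1,  8,  0, 46,  0],  # actual Jaguar
    [ 2,  0,  0,  1,  1, 42],  # actual Puma
]

# Number of "Tiger" predictions = sum of the Tiger column across all rows.
tiger = labels.index("Tiger")
predicted_tiger = sum(row[tiger] for row in cm)
print(predicted_tiger)  # 83
```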

Q. What is the probability that a Puma will be classified correctly?

A. Looking at the matrix above, we can see that a Puma was classified correctly 42 times. But twice it was classified as a Cat, once as a Tiger, and once as a Jaguar. So the probability works out to: 42 / (42 + 2 + 1 + 1) = 42/46 ≈ 0.91

This concept is called RECALL. It is the fraction of correctly predicted positives out of all actual positives. So we can say that Recall = (True Positives) / (True Positives + False Negatives)
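Recall for any class can be computed directly from the matrix: the diagonal entry is the true positives, and the row sum is all images that were actually that class. A minimal sketch (the helper name `recall` is mine, not from the post):

```python
# Confusion matrix from the post: rows = actual animal, columns = predicted animal.
labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = [
    [62,  2,  0,  0,  1,  0],  # actual Cat
    [ 1, 50,  1,  0,  4,  0],  # actual Dog
    [ 0,  2, 98,  4,  0,  0],  # actual Leopard
    [ 0,  0, 10, 78,  2,  0],  # actual Tiger
    [ 0,  1,  8,  0, 46,  0],  # actual Jaguar
    [ 2,  0,  0,  1,  1, 42],  # actual Puma
]

def recall(cm, labels, cls):
    i = labels.index(cls)
    tp = cm[i][i]              # true positives: the diagonal entry
    actual_total = sum(cm[i])  # row sum: all images actually of this class
    return tp / actual_total

print(round(recall(cm, labels, "Puma"), 2))  # 42/46 ≈ 0.91
```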

Q. What is the probability that when our algorithm identifies an image as a Cat, it is actually a Cat?

A. Looking at the matrix above, we can see that our algorithm identified a Dog as a Cat once, a Puma as a Cat twice, and a Cat as a Cat 62 times. So the probability works out to: 62 / (62 + 1 + 2) = 62/65 ≈ 0.95

This concept is called PRECISION. It is the fraction of correctly predicted positives out of all predicted positives. So we can say that Precision = (True Positives) / (True Positives + False Positives)
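Precision mirrors recall, except the denominator is the column sum (all predictions of that class) rather than the row sum. A minimal sketch (the helper name `precision` is mine, not from the post):

```python
# Confusion matrix from the post: rows = actual animal, columns = predicted animal.
labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = [
    [62,  2,  0,  0,  1,  0],  # actual Cat
    [ 1, 50,  1,  0,  4,  0],  # actual Dog
    [ 0,  2, 98,  4,  0,  0],  # actual Leopard
    [ 0,  0, 10, 78,  2,  0],  # actual Tiger
    [ 0,  1,  8,  0, 46,  0],  # actual Jaguar
    [ 2,  0,  0,  1,  1, 42],  # actual Puma
]

def precision(cm, labels, cls):
    j = labels.index(cls)
    tp = cm[j][j]                                  # true positives: the diagonal entry
    predicted_total = sum(row[j] for row in cm)    # column sum: all predictions of this class
    return tp / predicted_total

print(round(precision(cm, labels, "Cat"), 2))  # 62/65 ≈ 0.95
```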