Precision and Recall are evaluation metrics used to measure the performance of a machine learning algorithm.
This post assumes some prior understanding of the confusion matrix; if you need a refresher, I would recommend going through it here.
Example of Animal Image Recognition
Consider the confusion matrix below for an algorithm that takes animal images as input and tries to identify each animal correctly:
| ANIMALS (actual ↓ / predicted →) | Cat | Dog | Leopard | Tiger | Jaguar | Puma |
|---|---|---|---|---|---|---|
| Cat | 62 | 2 | 0 | 0 | 1 | 0 |
| Dog | 1 | 50 | 1 | 0 | 4 | 0 |
| Leopard | 0 | 2 | 98 | 4 | 0 | 0 |
| Tiger | 0 | 0 | 10 | 78 | 2 | 0 |
| Jaguar | 0 | 1 | 8 | 0 | 46 | 0 |
| Puma | 2 | 0 | 0 | 1 | 1 | 42 |
Explaining a Few Cells:
Each row shows what actually appeared in the images, and each column shows what the algorithm predicted. The diagonal cells (62, 50, 98, 78, 46, 42) are correct predictions; everything off the diagonal is a misclassification. For example, the 10 in the Tiger row means a Tiger was misidentified as a Leopard 10 times.
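To make the table concrete, here is a minimal sketch in Python (assuming NumPy is available; the `cm` array and `labels` list are illustrative names I am introducing here, not something from the original post) that encodes the matrix with rows as actual animals and columns as predictions:

```python
import numpy as np

# Rows = actual animal, columns = predicted animal,
# in the order: Cat, Dog, Leopard, Tiger, Jaguar, Puma.
labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = np.array([
    [62,  2,  0,  0,  1,  0],   # actual Cat
    [ 1, 50,  1,  0,  4,  0],   # actual Dog
    [ 0,  2, 98,  4,  0,  0],   # actual Leopard
    [ 0,  0, 10, 78,  2,  0],   # actual Tiger
    [ 0,  1,  8,  0, 46,  0],   # actual Jaguar
    [ 2,  0,  0,  1,  1, 42],   # actual Puma
])

# The diagonal holds the correct predictions for each animal.
print(np.diag(cm))  # [62 50 98 78 46 42]
```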
Some Questions and Answers
Q. How many times did our algorithm predict an image to be a Tiger?
A. Looking at the Tiger column: 0 + 0 + 4 + 78 + 0 + 1 = 83
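In code, this is just the sum of the Tiger column of the matrix. A small sketch, using the same assumed `cm` encoding as above (repeated so the snippet runs on its own):

```python
import numpy as np

labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = np.array([[62, 2, 0, 0, 1, 0], [1, 50, 1, 0, 4, 0],
               [0, 2, 98, 4, 0, 0], [0, 0, 10, 78, 2, 0],
               [0, 1, 8, 0, 46, 0], [2, 0, 0, 1, 1, 42]])

# Total "Tiger" predictions = sum of the Tiger column.
tiger = labels.index("Tiger")
print(cm[:, tiger].sum())  # 83
```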
Q. What is the probability that a Puma will be classified correctly?
A. Looking at the matrix above, we can see that Puma was classified correctly 42 times, but it was misclassified as Cat twice, as Tiger once, and as Jaguar once. So the probability works out to 42 / (42 + 2 + 1 + 1) = 42/46 ≈ 0.91
This concept is called RECALL. It is the fraction of correctly predicted positives out of all actual positives. So we can say that Recall = (True Positives) / (True Positives + False Negatives)
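Recall for every class at once can be read off the matrix by dividing the diagonal by the row sums, since each row holds all actual instances of one animal. A hedged sketch, again assuming the same NumPy encoding:

```python
import numpy as np

labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = np.array([[62, 2, 0, 0, 1, 0], [1, 50, 1, 0, 4, 0],
               [0, 2, 98, 4, 0, 0], [0, 0, 10, 78, 2, 0],
               [0, 1, 8, 0, 46, 0], [2, 0, 0, 1, 1, 42]])

# Recall per class = correct predictions (diagonal)
# divided by all actual instances of that class (row sum).
recall = np.diag(cm) / cm.sum(axis=1)
print(dict(zip(labels, recall.round(2))))
# Puma comes out at 42/46 ~ 0.91, matching the answer above.
```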
Q. What is the probability that when our algorithm identifies an image as Cat, it is actually a Cat?
A. Looking at the Cat column, our algorithm identified a Dog as a Cat once, a Puma as a Cat twice, and a Cat as a Cat 62 times. So the probability works out to 62 / (62 + 1 + 2) = 62/65 ≈ 0.95
This concept is called PRECISION. It is the fraction of correctly predicted positives out of all predicted positives. So we can say that Precision = (True Positives) / (True Positives + False Positives)
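Precision works the same way, but with column sums instead of row sums, since each column holds everything the algorithm predicted as one animal. One more sketch under the same assumptions:

```python
import numpy as np

labels = ["Cat", "Dog", "Leopard", "Tiger", "Jaguar", "Puma"]
cm = np.array([[62, 2, 0, 0, 1, 0], [1, 50, 1, 0, 4, 0],
               [0, 2, 98, 4, 0, 0], [0, 0, 10, 78, 2, 0],
               [0, 1, 8, 0, 46, 0], [2, 0, 0, 1, 1, 42]])

# Precision per class = correct predictions (diagonal)
# divided by all predictions of that class (column sum).
precision = np.diag(cm) / cm.sum(axis=0)
print(dict(zip(labels, precision.round(2))))
# Cat comes out at 62/65 ~ 0.95, matching the answer above.
```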