@drisyack I'm also having the same issue! Can anyone help with this?
Can you guys provide a little bit more context to see where the problem might be?
Seems like the same issue:

Confusion Matrix:
[[  0. 196.   0.   0.   0.   0.  21.]
 [  0.   0. 158.   0.   0.   0.   0.]
 [  3.   0.   0. 169.   0.   0.  87.]
 [  1.   6.   0.   0.  71.   0.  55.]
 [  1.   0.   0.   0.   0.  88.   7.]
 [  0.   0.   0.   0.   0.   0.   0.]
 [594.  33.   2.   7.  11.   3. 141.]]
precision_mold@0.5IOU: 0.00
recall_mold@0.5IOU: 0.00
precision_burned@0.5IOU: 0.00
recall_burned@0.5IOU: 0.00
precision_damp@0.5IOU: 0.00
recall_damp@0.5IOU: 0.00
precision_leak@0.5IOU: 0.00
recall_leak@0.5IOU: 0.00
precision_copper leakage@0.5IOU: 0.00
recall_copper leakage@0.5IOU: 0.00
confusion_matrix.py:115: RuntimeWarning: invalid value encountered in double_scalars
  recall = float(confusion_matrix[id, id] / total_target)
precision_oxidation@0.5IOU: 0.00
recall_oxidation@0.5IOU: nan
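The nan for recall_oxidation is a plain division by zero: total_target is 0 because no ground-truth boxes of that class were counted. A minimal guard against the same confusion_matrix array the script builds (safe_precision_recall is a hypothetical helper, not part of confusion_matrix.py):

import numpy as np

def safe_precision_recall(confusion_matrix, class_id):
    # Per-class precision/recall that report 0.0 instead of nan when a
    # class has no ground-truth boxes or no detections at all.
    tp = confusion_matrix[class_id, class_id]
    total_target = confusion_matrix[class_id, :].sum()     # ground-truth boxes
    total_predicted = confusion_matrix[:, class_id].sum()  # detected boxes
    recall = tp / total_target if total_target > 0 else 0.0
    precision = tp / total_predicted if total_predicted > 0 else 0.0
    return precision, recall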
I think the root cause is that the "groundtruth_classes" indices start from 0, while the "detection_classes" indices start from 1.
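That would explain the matrix above: the large counts sit one column to the right of the diagonal instead of on it. If so, the two arrays need to be shifted onto the same index space before the matrix is filled. A sketch of the fix, assuming groundtruth_classes really is 0-based and detection_classes 1-based (verify this against your own label map; the example values are made up):

import numpy as np

groundtruth_classes = np.array([0, 2, 5])  # 0-based, as read from the TFRecord
detection_classes = np.array([1, 3, 6])    # 1-based, as returned by the model

detection_classes = detection_classes - 1  # align to 0-based before indexing
# With both arrays 0-based, the confusion_matrix[gt, det] increments for
# correct detections now land on the diagonal.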
I don't really understand the confusion matrix. Aren't the numbers supposed to be confidence scores (between 0 and 1)? What does the 118 in the first row mean here?
Confusion Matrix:
[[118. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 7.]
[ 0. 10. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[ 0. 0. 13. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
[ 0. 0. 0. 18. 0. 0. 1. 0. 0. 0. 0. 2.]
[ 0. 2. 0. 0. 19. 0. 0. 0. 0. 0. 0. 10.]
[ 1. 0. 0. 0. 0. 21. 0. 0. 0. 0. 0. 0.]
[ 1. 0. 0. 0. 2. 2. 14. 0. 0. 0. 0. 2.]
[ 0. 0. 0. 0. 0. 0. 0. 21. 0. 0. 0. 0.]
[ 0. 0. 0. 0. 0. 1. 0. 0. 18. 0. 0. 2.]
[ 0. 0. 0. 0. 1. 1. 1. 0. 0. 19. 0. 1.]
[ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 18. 0.]
[ 2. 1. 0. 2. 0. 0. 0. 0. 0. 0. 1. 0.]]
precision_Rice@0.5IOU: 0.97
recall_Rice@0.5IOU: 0.94
precision_Burger@0.5IOU: 0.77
recall_Burger@0.5IOU: 1.00
precision_Pizza@0.5IOU: 1.00
recall_Pizza@0.5IOU: 0.93
precision_Steak@0.5IOU: 0.90
recall_Steak@0.5IOU: 0.86
precision_French_Fries@0.5IOU: 0.86
recall_French_Fries@0.5IOU: 0.61
precision_Nasi_Goreng@0.5IOU: 0.84
recall_Nasi_Goreng@0.5IOU: 0.95
precision_Ayam_Goreng@0.5IOU: 0.88
recall_Ayam_Goreng@0.5IOU: 0.67
precision_Laksa@0.5IOU: 1.00
recall_Laksa@0.5IOU: 1.00
precision_Mei_Goreng@0.5IOU: 1.00
recall_Mei_Goreng@0.5IOU: 0.86
precision_Curry_Puff@0.5IOU: 1.00
recall_Curry_Puff@0.5IOU: 0.83
precision_Boiled_Eggs@0.5IOU: 0.95
recall_Boiled_Eggs@0.5IOU: 1.00
The confusion matrix doesn't show confidence values. I'd suggest you check out this link for an understanding of what a confusion matrix shows: https://www.dataschool.io/simple-guide-to-confusion-matrix-terminology/
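To tie it to the matrix you posted: each row is a ground-truth class, each column a detected class, and the extra last row/column collects unmatched ground truths and detections. Every cell is a count of matched boxes, not a score, so the 118 means 118 Rice boxes were correctly detected as Rice. As a quick check (the array below is just your matrix retyped, in the class order of your output):

import numpy as np

cm = np.array([
    [118, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 7],
    [0, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 13, 0, 0, 0, 0, 0, 0, 0, 0, 1],
    [0, 0, 0, 18, 0, 0, 1, 0, 0, 0, 0, 2],
    [0, 2, 0, 0, 19, 0, 0, 0, 0, 0, 0, 10],
    [1, 0, 0, 0, 0, 21, 0, 0, 0, 0, 0, 0],
    [1, 0, 0, 0, 2, 2, 14, 0, 0, 0, 0, 2],
    [0, 0, 0, 0, 0, 0, 0, 21, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 1, 0, 0, 18, 0, 0, 2],
    [0, 0, 0, 0, 1, 1, 1, 0, 0, 19, 0, 1],
    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 18, 0],
    [2, 1, 0, 2, 0, 0, 0, 0, 0, 0, 1, 0],
])

rice = 0  # Rice is the first class
print(cm[rice, rice] / cm[:, rice].sum())  # precision: 118 / 122 ≈ 0.97
print(cm[rice, rice] / cm[rice, :].sum())  # recall:    118 / 125 ≈ 0.94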
Thanks a lot. Your code helped me immensely. However, I do have a few doubts about the confusion matrix part.
Remember that the confusion matrix accumulates the results of every image together: it includes the results of all the images in the dataset. If you computed a separate "confusion matrix" for each image, you'd obtain a matrix with only a single value inside, which wouldn't make much sense.
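Schematically, the script keeps one (num_classes + 1)-square matrix and adds every matched or unmatched box into it while walking the dataset. A toy sketch of that accumulation, with made-up per-image match pairs standing in for the script's IoU matching step (11 classes, as in the food example above):

import numpy as np

num_classes = 11
cm = np.zeros((num_classes + 1, num_classes + 1))  # extra row/column for unmatched boxes

# Hypothetical per-image results: (ground-truth class, detected class) pairs,
# with index num_classes standing in for "no match" (false negative/positive).
images = [
    [(0, 0), (3, 3)],    # image 1: two correct detections
    [(0, 11), (11, 2)],  # image 2: one missed box, one spurious detection
]
for matches in images:
    for gt, det in matches:
        cm[gt, det] += 1  # one shared matrix across all images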
While I am running the code, the confusion matrix prints full of zeros, and the precision and recall are not printed; it shows an error. Please let me know what mistake I have made here. Thank you.