Open mjack3 opened 2 years ago
Hello.

I noticed that the model always says "this image has a defect" when testing. You can see a commented-out part of the code where the author calculates the confusion matrix. This is what I get for the category hazelnut.

Note that the 40 means that the model classified 40 samples as defective instead of 'good'.

My args:

I am testing with the latest version of PyTorch (v1.11). Is there something wrong?

The last parameter of the confusion matrix is the threshold: the lower the threshold, the more samples are classified as positive. The threshold left in the comment is far too low. I set it to 2.0 for my own data and the resulting confusion matrix looks much more reasonable. That said, I suspect a positive/negative verdict is not that useful for this method, since it produces the anomaly map (amap) image with localization information anyway. Maybe that is why that code was commented out?
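The threshold behaviour discussed in this thread can be sketched with a toy example. Everything below (the scores, labels, and the `confusion` helper) is hypothetical and not taken from the repo; it just illustrates why a too-low threshold makes every sample come out "defective":

```python
import numpy as np

# Hypothetical per-image anomaly scores (e.g. the max of each anomaly map)
# and ground-truth labels: 0 = good, 1 = defective.
scores = np.array([0.8, 1.5, 2.5, 3.1, 0.9, 2.8])
labels = np.array([0, 0, 1, 1, 0, 1])

def confusion(labels, preds):
    """Return (tn, fp, fn, tp) for binary labels and predictions."""
    tn = int(np.sum((labels == 0) & (preds == 0)))
    fp = int(np.sum((labels == 0) & (preds == 1)))
    fn = int(np.sum((labels == 1) & (preds == 0)))
    tp = int(np.sum((labels == 1) & (preds == 1)))
    return tn, fp, fn, tp

# A threshold below every score flags everything as defective,
# which is exactly the "always says defect" symptom.
low = confusion(labels, (scores >= 0.5).astype(int))
print(low)   # → (0, 3, 0, 3): all three good samples are false positives

# Raising the threshold (here 2.0, as in the comment above) separates
# the good samples from the defective ones.
high = confusion(labels, (scores >= 2.0).astype(int))
print(high)  # → (3, 0, 0, 3): no misclassifications
```

With made-up scores like these the right threshold is obvious; on real data it is usually picked from a validation split (e.g. via an ROC curve) rather than hard-coded.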