Closed dododo0611 closed 2 years ago
Nice catch, that part doesn't make perfect sense. I think I tried using `sum`, only to find that `sum` makes `log_prob` much smaller (as it's negative), so that `prob = torch.exp(log_prob)` becomes too small and the final performance dropped.
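To make the underflow concrete, here is a small NumPy sketch (the channel count and the simulated log-probs are invented for illustration): summing a few hundred negative per-channel log-probs drives `exp` below the float32 range, while the mean stays comfortably representable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-channel log-probabilities for one pixel: 256 channels,
# each contributing a negative value (as Gaussian log-probs do).
log_prob = (-0.5 * rng.chisquare(df=1, size=256)).astype(np.float32)

prob_from_sum = np.exp(log_prob.sum())    # ~exp(-128): underflows to 0.0 in float32
prob_from_mean = np.exp(log_prob.mean())  # ~exp(-0.5): well within float32 range

print(prob_from_sum, prob_from_mean)
```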
For easier understanding: theoretically, AUROC isn't affected by this point (no threshold is involved, and the ordering of the per-pixel scores is preserved). So the reason the performance dropped might be floating-point error. Hope that makes sense.
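The ordering argument can be checked directly (shapes and values below are invented): `mean` over channels is just `sum` divided by a positive constant, so the per-pixel ranking, and hence any threshold-free metric such as AUROC, is identical in exact arithmetic.

```python
import numpy as np

rng = np.random.default_rng(1)
C = 64                                       # hypothetical channel count
log_prob = -rng.exponential(size=(C, 1000))  # fake per-channel log-probs for 1000 pixels

score_mean = log_prob.mean(axis=0)  # mean over channels
score_sum = log_prob.sum(axis=0)    # sum over channels (= score_mean * C)

# Both reductions rank the pixels identically, so AUROC is unchanged.
same_order = (np.argsort(score_mean) == np.argsort(score_sum)).all()
print(same_order)
```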
Thanks, I also found that the performance becomes weird after replacing the operation with summation, which is why I came here to create the issue :).
I think this explanation makes sense and the mean value won't affect the ultimate result.
Thanks all!
@cytotoxicity8 Another reason is that the anomaly scores from different feature maps are averaged at the end. Perhaps one level presents a dominant anomaly score while the others don't, but `exp` makes all the values much smaller, so the true positives are not distinguishable.
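A tiny sketch of that failure mode (the level count and log-prob values are invented): when the summed log-probs are negative enough, `exp` underflows at every level, and averaging the resulting probabilities erases the one level that actually flagged the anomaly.

```python
import numpy as np

# Invented summed log-probs for one normal and one anomalous pixel
# across 3 feature-map levels; only level 0 flags the anomaly.
log_prob_normal = np.array([-110.0, -115.0, -112.0], dtype=np.float32)
log_prob_anom = np.array([-150.0, -115.0, -112.0], dtype=np.float32)

# In log space the anomalous pixel is clearly separated on average...
print(log_prob_normal.mean(), log_prob_anom.mean())

# ...but exp() underflows to 0.0 in float32 at every level, so the
# averaged probabilities (and any anomaly score derived from them)
# are identical for both pixels.
prob_normal = np.exp(log_prob_normal).mean()
prob_anom = np.exp(log_prob_anom).mean()
print(prob_normal, prob_anom)  # both 0.0
```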
Thanks for the amazing repository! I have a tiny doubt concerning the log-likelihood calculation. https://github.com/gathierry/FastFlow/blob/d275b79d47d6e272115d45fd7fc0f29cca0f5107/fastflow.py#L148 I don't understand why the mean over the channel dimension is used to compute the log likelihood, since a summation is usually used instead, as in https://github.com/gathierry/FastFlow/blob/d275b79d47d6e272115d45fd7fc0f29cca0f5107/fastflow.py#L139-L141 Thanks!