hcw-00 / PatchCore_anomaly_detection

Unofficial implementation of PatchCore anomaly detection
Apache License 2.0

Using layer4 #27

Closed stvogel closed 2 years ago

stvogel commented 2 years ago

Thanks for this great implementation. In the paper, under "Evaluation on other benchmarks", the authors state:

As the detection context is much closer to that of natural image data available in ImageNet and images are larger, we make use of deeper network feature maps at hierarchy levels 3 and 4, but otherwise do not perform any hyperparameter tuning for PatchCore.

I tried simply swapping layers 2 and 3 for layers 3 and 4 and had to adapt the reshaping further down. But the results were a complete failure: the heatmap is always blank.

Has someone tried to use layer4 successfully?

royarahimzadeh commented 2 years ago

I have tried layers 3 and 4 in the industrial version of this code (https://github.com/dhkdnduq/PatchCore_anomaly_detection) like this:

self.model.layer3[-1].register_forward_hook(hook_t)
self.model.layer4[-1].register_forward_hook(hook_t)

For my input images, the results had a lower AUROC than with layers 2 and 3.
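To see why moving the hooks down changes the reshaping, here is a minimal, self-contained sketch of the hook pattern above. The backbone is a hypothetical stand-in (three strided convolutions, not a real ResNet), but it reproduces the stride pattern that makes layer4's feature map half the side length of layer3's:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a ResNet backbone: each "layer" halves the
# spatial resolution, matching the stride pattern of the real network.
class Backbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer2 = nn.Sequential(nn.Conv2d(3, 8, 3, stride=2, padding=1))
        self.layer3 = nn.Sequential(nn.Conv2d(8, 16, 3, stride=2, padding=1))
        self.layer4 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1))

    def forward(self, x):
        return self.layer4(self.layer3(self.layer2(x)))

features = []
def hook_t(module, inp, out):
    # A forward hook fires after the hooked sub-module; we just stash its output.
    features.append(out)

model = Backbone().eval()
model.layer3[-1].register_forward_hook(hook_t)
model.layer4[-1].register_forward_hook(hook_t)

with torch.no_grad():
    model(torch.randn(1, 3, 56, 56))

# layer4's map has half the side length of layer3's, so any reshape that
# assumed layer2/layer3 sizes must change when the hooks move down.
print([tuple(f.shape) for f in features])  # [(1, 16, 14, 14), (1, 32, 7, 7)]
```

Any code downstream of the hooks that hard-codes the feature-map size (the reshape discussed below, for instance) has to be updated consistently with the new resolutions.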

stvogel commented 2 years ago

Thanks for your comment, @royarahimzadeh .

I also had to adapt the reshape here (when I switched to only layers 3 and 4):

anomaly_map = score_patches[:, 0].reshape((28, 28))

But on some image datasets the numbers in score_patches were so large that

w = (1 - (np.max(np.exp(N_b)) / np.sum(np.exp(N_b))))

evaluated to NaN.
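That NaN comes from overflow: for large distances np.exp saturates to inf, and inf/inf is NaN. A small NumPy sketch (with made-up distance values) of the standard max-subtraction trick, which computes the same weight but stays finite:

```python
import numpy as np

# N_b: distances from a test patch to its k nearest memory-bank patches.
# These values are hypothetical; anything much above ~709 overflows
# np.exp in float64.
N_b = np.array([1200.0, 1250.0, 1300.0])

# Naive weight as in the snippet above: exp overflows to inf,
# inf / inf gives NaN.
with np.errstate(over='ignore'):
    naive = 1 - np.max(np.exp(N_b)) / np.sum(np.exp(N_b))

# Stable version: subtract the max before exponentiating. This multiplies
# numerator and denominator by the same factor exp(-max(N_b)), so the
# ratio (and hence w) is mathematically unchanged.
shifted = np.exp(N_b - np.max(N_b))
stable = 1 - np.max(shifted) / np.sum(shifted)

print(naive)   # nan
print(stable)  # a tiny but finite, well-defined weight
```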

I also tried using all three layers (2, 3, and 4). In that case I didn't have to adapt the reshaping, of course, but the AUROC also came out slightly lower.
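When several layers are combined, their maps have different resolutions, so one common approach (a sketch with hypothetical channel counts, not the repo's exact code) is to upsample the deeper maps to the shallowest resolution and concatenate along channels, so every spatial position becomes one patch embedding:

```python
import torch
import torch.nn.functional as F

# Hypothetical feature maps at the sizes a 224x224 input would give in a
# standard ResNet-18 (layer2: 28x28, layer3: 14x14, layer4: 7x7).
l2 = torch.randn(1, 128, 28, 28)
l3 = torch.randn(1, 256, 14, 14)
l4 = torch.randn(1, 512, 7, 7)

# Bring the deeper maps up to the shallowest resolution (nearest-neighbour
# keeps each patch's features intact), then concatenate along channels.
target = l2.shape[-2:]
emb = torch.cat([l2,
                 F.interpolate(l3, size=target, mode='nearest'),
                 F.interpolate(l4, size=target, mode='nearest')],
                dim=1)
print(tuple(emb.shape))  # (1, 896, 28, 28)
```

With only layers 3 and 4 hooked, the same scheme would instead produce a 14x14 grid, which is why the reshape above has to track whichever layer sets the anchor resolution.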

stvogel commented 2 years ago

Since this wasn't really an issue but rather a point of interest, I'm simply closing it.