kaifishr / PyTorchRelevancePropagation

A basic implementation of Layer-wise Relevance Propagation (LRP) in PyTorch.
https://kaifishr.github.io/2021/12/15/relevance-propagation-pytorch.html
75 stars 4 forks

question in lrp_layers.py #3

Open S200331082 opened 1 year ago

S200331082 commented 1 year ago

Hi @kaifishr, thanks for your implementation. I'm trying to reimplement LRP on ResNet50, but it has BatchNorm2D layers in the backbone. I'm a freshman in Python and I don't know how to code the RelevancePropagationBatchNorm2D in lrp_layers.py. Can you give me some ideas? Thanks a lot.

kaifishr commented 1 year ago

Since BatchNorm2D consists of two consecutive affine linear transformations, I would try to weight the relevance scores by the weight parameters that the batch normalization layer learned during training.
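Concretely, a minimal sketch of that idea could look like the layer below. It treats an eval-mode BatchNorm2d (using its running statistics, affine=True) as a single per-channel affine map y = w * x + b and redistributes relevance with an epsilon-style rule. The class name `RelevancePropagationBatchNorm2d` and the `forward(a, r)` interface (stored activations in, relevance out) are assumptions meant to mirror the other layers in lrp_layers.py, not a confirmed part of the repository:

```python
import torch
from torch import nn


class RelevancePropagationBatchNorm2d(nn.Module):
    """Sketch of relevance propagation through a BatchNorm2d layer.

    Treats batch normalization in eval mode as a per-channel affine map
        y = w * x + b,  with  w = gamma / sqrt(running_var + eps)
                        and   b = beta - w * running_mean,
    and redistributes relevance with an epsilon rule:
        R_x = (w * x) / (y + stabilizer) * R_y.
    Assumes the wrapped layer was created with affine=True.
    """

    def __init__(self, layer: nn.BatchNorm2d, eps: float = 1e-6) -> None:
        super().__init__()
        self.layer = layer
        self.eps = eps

    @torch.no_grad()
    def forward(self, a: torch.Tensor, r: torch.Tensor) -> torch.Tensor:
        bn = self.layer

        # Fold the learned scale/shift and the running statistics into a
        # single per-channel scale w and shift b.
        std = torch.sqrt(bn.running_var + bn.eps)
        w = (bn.weight / std).view(1, -1, 1, 1)
        b = (bn.bias - bn.running_mean * bn.weight / std).view(1, -1, 1, 1)

        # Output of the folded affine map for the stored activations a,
        # stabilized away from zero to avoid division blow-ups.
        z = w * a + b
        z = z + self.eps * ((z >= 0).to(z.dtype) * 2.0 - 1.0)

        # Each input gets the share of relevance proportional to its
        # contribution w * a to the output z.
        return (w * a / z) * r
```

Two alternatives that are also used in practice: simply pass the relevance through unchanged (since batch norm in eval mode is a fixed per-channel rescaling), or fold the batch norm parameters into the preceding convolution before running LRP, so no dedicated batch norm rule is needed at all.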

zbb2022hust commented 1 year ago

Hello, I also run into problems when propagating relevance through a BatchNorm1D layer. I'm not a math professional, but I urgently need this method to evaluate my FCN model in a data-driven fault diagnosis task. Could you add RelevancePropagationBatchNorm1d/2d to lrp_layers, or explain more clearly how to calculate this? Thanks! Best regards.

peacefulotter commented 1 month ago

Bumping this as BatchNorm2D is required for ResNet models!