utkuozbulak / pytorch-cnn-visualizations

Pytorch implementation of convolutional neural network visualization techniques
MIT License
7.81k stars 1.49k forks

Implementation of guided backpropagation #97

Closed huihui-v closed 3 years ago

huihui-v commented 3 years ago

Hey!

Thanks for your great project. I'm working on some visualization tasks and was trying to re-implement guided backpropagation on ResNet by referring to your code. Here is my problem:

https://github.com/utkuozbulak/pytorch-cnn-visualizations/blob/16eddfa055a9c618ba548e9fb4529e2ccbc79c35/src/guided_backprop.py#L35

According to your implementation, you use a forward hook to record the outputs of the ReLU layers, then multiply them (as a mask) with the "clamped gradient" in the backward hook. This way, the gradients at positions where the original ReLU output was zero are themselves set to zero.
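
To make sure I understand the approach, here is a minimal sketch (not the repository's exact code; the tiny `nn.Sequential` model and hook names are my own) of the two-hook pattern as I read it, using `register_full_backward_hook` from current PyTorch:

```python
import torch
import torch.nn as nn

# Toy model for illustration only.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 1))

forward_outputs = []  # stack of recorded ReLU forward outputs

def forward_hook(module, inp, out):
    # Record the ReLU output during the forward pass.
    forward_outputs.append(out)

def backward_hook(module, grad_in, grad_out):
    # Guided backprop as I understand the repo's version:
    # clamp negative gradients, then zero positions where the
    # recorded forward activation was not positive.
    mask = (forward_outputs.pop() > 0).float()  # hooks fire in reverse order
    return (torch.clamp(grad_in[0], min=0.0) * mask,)

for m in model.modules():
    if isinstance(m, nn.ReLU):
        m.register_forward_hook(forward_hook)
        m.register_full_backward_hook(backward_hook)

x = torch.randn(1, 4, requires_grad=True)
model(x).backward()
# x.grad now holds the guided-backprop gradient w.r.t. the input
```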

But I think ReLU's autograd already handles this automatically. So, in my opinion, we could drop the entire forward hook (i.e., not record the ReLU outputs during the forward pass) and use only modified_grad_out = torch.clamp(grad_in[0], min=0.), and the result would be the same.
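
The simplification I have in mind looks like this (a sketch under my assumption that `grad_in[0]` in the backward hook has already passed through ReLU's own backward, which zeroes positions where the input was non-positive; the toy model is mine):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 1))

def clamp_only_hook(module, grad_in, grad_out):
    # Only clamp negative gradients; rely on autograd for the
    # forward mask (grad is already zero where the ReLU input was <= 0).
    return (torch.clamp(grad_in[0], min=0.0),)

for m in model.modules():
    if isinstance(m, nn.ReLU):
        m.register_full_backward_hook(clamp_only_hook)

x = torch.randn(1, 4, requires_grad=True)
model(x).backward()
# x.grad should match what the two-hook version produces
```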

It's my first time using hook functions in PyTorch, so there may be some tricks I don't know about. If you have any thoughts on this problem, please let me know!

huihui-v commented 3 years ago

Sorry, I just checked the closed issues (https://github.com/utkuozbulak/pytorch-cnn-visualizations/issues/36#issuecomment-451354923) and found the answer. I'll check out the paper and close this issue.