Open htwangtw opened 2 years ago
Most of the tutorials I found use AlexNet as the pretrained model, so I would consider refactoring the code to be closer to it: https://github.com/pytorch/vision/blob/main/torchvision/models/alexnet.py We might even be able to reuse some of the existing guided backprop modules
Sorry, I am not sure I follow. Instead of having an independent hook function, you would like to integrate it directly as a custom ReLU layer?
I actually don't know if I am describing the problem correctly. I am not integrating the hook function.
In all the PyTorch-based code I can find, they look for a ReLU module in the model and apply the hook function to it. In the current implementation of YuGCN, the lookup cannot find a ReLU module because it is simply not there.
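To illustrate the problem, here is a minimal sketch (the model classes and the guided-backprop hook below are hypothetical examples, not the actual YuGCN code): `named_modules()` only sees ReLU when it is registered as a submodule, so the usual hook-registration loop finds nothing when `F.relu` is called inside `forward()`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModuleReLUNet(nn.Module):
    """ReLU registered as a module: visible to named_modules()."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.fc(x))

class FunctionalReLUNet(nn.Module):
    """ReLU called as a function: invisible to named_modules()."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)

    def forward(self, x):
        return F.relu(self.fc(x))

def register_guided_hooks(model):
    """Attach a guided-backprop style hook to every ReLU *module* found."""
    def clamp_grad(module, grad_in, grad_out):
        # Guided backprop: only let positive gradients flow back through ReLU.
        return tuple(torch.clamp(g, min=0.0) for g in grad_in)

    handles = []
    for name, mod in model.named_modules():
        if isinstance(mod, nn.ReLU):
            handles.append(mod.register_full_backward_hook(clamp_grad))
    return handles

print(len(register_guided_hooks(ModuleReLUNet())))      # finds 1 ReLU module
print(len(register_guided_hooks(FunctionalReLUNet())))  # finds 0
```

Both networks compute exactly the same forward pass, but only the first one exposes a ReLU the hook loop can attach to.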
I found this tutorial using captum to visualise a GCN built with torch_geometric: https://colab.research.google.com/drive/1fLJbFPz0yMCQg81DdCP5I8jXw9LoggKO?usp=sharing I am going to try it out.
It is hard to find online documentation on this type of visualization method; I always hit stuff about digital marketing... But if you have an explanation I would be happy to hear it :)
In the current implementation ReLU is called as a function after each convolution layer. The guided back-propagation tutorials I can find online apply the hook function when they detect ReLU implemented as a module. I am not sure what would be the right way to modify YuGCN to make this process easier. cc @ltetrel
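One possible refactoring (a sketch with assumed names, not the actual YuGCN layers): replace the `F.relu(...)` calls in `forward()` with `nn.ReLU` submodules, so that hook-based tools can discover them by iterating over `named_modules()`.

```python
import torch.nn as nn

class ConvBlock(nn.Module):
    """Hypothetical convolution block with ReLU as a registered module.

    Because nn.ReLU is registered as a submodule here, it shows up in
    model.named_modules(), which is what guided-backprop implementations
    iterate over to attach their hooks.
    """
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.block(x)  # was: F.relu(self.conv(x))
```

The forward computation is unchanged; only the module tree gains an explicit ReLU entry that hooks can target.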