SIMEXP / gcn_package

The lab repository for GCN
MIT License

The implementation of ReLU in YuGCN #9

Open htwangtw opened 2 years ago

htwangtw commented 2 years ago

In the current implementation, ReLU is called as a function after each convolution layer. The guided back-propagation tutorials I can find online apply the hook function when they detect ReLU implemented as a module. I am not sure what would be the right way to modify YuGCN to make this process easier. cc @ltetrel
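
For reference, here is a minimal sketch of the pattern those tutorials use (the function name is mine, nothing here is from gcn_package): iterate over `model.modules()` and attach a backward hook to every `nn.ReLU` instance. This is exactly the step that fails when ReLU is only ever called through `torch.nn.functional.relu`:

```python
import torch
import torch.nn as nn

def register_guided_relu_hooks(model: nn.Module):
    """Attach guided-backprop hooks to every nn.ReLU module in `model`."""
    def relu_backward_hook(module, grad_input, grad_output):
        # Guided backprop: on top of the normal ReLU backward (which already
        # zeroes gradients where the forward input was negative), also zero
        # the negative incoming gradients.
        return (torch.clamp(grad_input[0], min=0.0),)

    handles = []
    for module in model.modules():
        if isinstance(module, nn.ReLU):
            handles.append(module.register_full_backward_hook(relu_backward_hook))
    return handles  # call handle.remove() on each to restore plain backprop
```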

htwangtw commented 2 years ago

Most of the tutorials I found use AlexNet as the pretrained model, so I would consider refactoring the code to be closer to it: https://github.com/pytorch/vision/blob/main/torchvision/models/alexnet.py We might even be able to reuse some of the existing guided backprop modules.
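
Something like the sketch below is what I have in mind. The class and attribute names are placeholders, and it assumes the existing graph-convolution layers stay as they are; the only change is holding the ReLUs as submodules, the way torchvision's AlexNet does:

```python
import torch.nn as nn

class YuGCNRefactored(nn.Module):
    """Sketch only: hold ReLU as submodules so hook-based tools can find them."""

    def __init__(self, conv1, conv2):
        super().__init__()
        self.conv1 = conv1      # existing graph-convolution layers, unchanged
        self.relu1 = nn.ReLU()  # registered as a module, discoverable by hooks
        self.conv2 = conv2
        self.relu2 = nn.ReLU()

    def forward(self, x):
        x = self.relu1(self.conv1(x))  # previously something like F.relu(self.conv1(x))
        x = self.relu2(self.conv2(x))
        return x
```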

ltetrel commented 2 years ago

Sorry, I am not sure I follow. Instead of having an independent hook function, you would like to integrate it directly as a custom ReLU layer?

htwangtw commented 2 years ago

> Sorry, I am not sure I follow. Instead of having an independent hook function, you would like to integrate it directly as a custom ReLU layer?

I actually don't know if I am describing the problem correctly. I am not proposing to integrate the hook function.

In all the PyTorch-based code I can find, they look for a ReLU module in the model and apply the hook function to it. In the current implementation of YuGCN, that search cannot find a ReLU module because the module is simply not there: ReLU is only ever called as a function.
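
A quick check that reproduces the problem outside of gcn_package (the model here is a made-up stand-in for the current YuGCN style):

```python
import torch.nn as nn
import torch.nn.functional as F

class FunctionalReluNet(nn.Module):
    """Stand-in for the current YuGCN style: ReLU is an inline function call."""

    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)

    def forward(self, x):
        return F.relu(self.fc(x))  # ReLU never appears among the submodules

model = FunctionalReluNet()
print([m for m in model.modules() if isinstance(m, nn.ReLU)])  # [] -- nothing to hook
```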

htwangtw commented 2 years ago

I found this tutorial using captum to visualise a GCN built with torch_geometric: https://colab.research.google.com/drive/1fLJbFPz0yMCQg81DdCP5I8jXw9LoggKO?usp=sharing

I am going to try it out
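
If it works the way the notebook suggests, the usage should look roughly like this. The wrapper and function names are mine; the idea is that Captum's IntegratedGradients attributes over a single tensor input, while a torch_geometric model's forward takes `(x, edge_index)`, so the graph structure has to be passed through as an extra argument:

```python
from captum.attr import IntegratedGradients

def forward_wrapper(x, model, edge_index):
    # Captum varies `x` only; the graph structure is passed through untouched.
    return model(x, edge_index)

def attribute_nodes(model, x, edge_index, target=0):
    ig = IntegratedGradients(forward_wrapper)
    # `additional_forward_args` are forwarded to forward_wrapper after `x`.
    # The result has the same shape as x: per-node, per-feature attributions.
    return ig.attribute(x, target=target,
                        additional_forward_args=(model, edge_index))
```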

ltetrel commented 2 years ago

It is hard to find online documentation on this type of visualization method; I always hit stuff about digital marketing... But if you have an explanation, I would be happy to hear it :)