🚀 Feature
We should implement NN modules as functions to allow building models with dynamically generated weights.
Motivation
As discussed in #3190, EvolveGCN is a typical example where the GCN weight is generated by an LSTM. While dgl.nn.GraphConv does allow the user to specify the weight in the forward pass, other modules do not.
This is also an issue for PyTorch modules in general, and PyTorch only provides functional versions of a few modules, such as conv2d or prelu, that are most likely to be needed with dynamic weights. So I guess the priority of implementing this is reasonably low.
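To make the distinction concrete, here is a minimal NumPy sketch (not DGL or PyTorch code; all names here are hypothetical) contrasting a module that owns its weight with a functional form that takes the weight as an argument:

```python
import numpy as np

class LinearModule:
    # Module form: the weight is allocated at construction time and is
    # fixed state of the module, so it cannot be supplied per call.
    def __init__(self, in_dim, out_dim, rng):
        self.weight = rng.standard_normal((in_dim, out_dim))

    def forward(self, x):
        return x @ self.weight

def linear(x, weight):
    # Functional form: the caller supplies the weight, so it can come
    # from anywhere -- e.g. the output of an LSTM at each time step.
    return x @ weight

rng = np.random.default_rng(0)
x = np.ones((2, 3))
mod = LinearModule(3, 4, rng)

# With the same weight, the two forms agree...
assert np.allclose(mod.forward(x), linear(x, mod.weight))

# ...but only the functional form accepts a dynamically generated weight.
w_t = rng.standard_normal((3, 4))  # stand-in for an LSTM-generated weight
out = linear(x, w_t)
assert out.shape == (2, 4)
```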
This issue tracks the feature request.
Alternatives
As a workaround, one could copy a module's implementation and replace the initialized weight with the externally generated one.
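A minimal sketch of that workaround (NumPy instead of PyTorch; the class name is hypothetical): copy the module and let its forward pass accept an optional weight that overrides the internal parameter, mirroring what dgl.nn.GraphConv already supports:

```python
import numpy as np

class DynamicLinear:
    # Copied module with one change: forward() accepts an optional,
    # externally generated weight that overrides the internal parameter.
    def __init__(self, in_dim, out_dim, rng):
        self.weight = rng.standard_normal((in_dim, out_dim))

    def forward(self, x, weight=None):
        w = self.weight if weight is None else weight
        return x @ w

rng = np.random.default_rng(0)
layer = DynamicLinear(3, 4, rng)
x = np.ones((2, 3))

generated = np.full((3, 4), 0.5)  # stand-in for an LSTM-generated weight
out = layer.forward(x, weight=generated)
# each output entry sums three 0.5 weights over the all-ones input
assert np.allclose(out, 1.5)
```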