Closed: Gaurangkarwande closed this issue 2 years ago
This is a great point, but it will involve a lot of work. Let me know if you want to help me with that.
In general, the input of a GNN layer will be [num_nodes, in_channels] and it will output a tensor of shape [num_nodes, out_channels].
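The shape convention above can be sketched with a minimal, self-contained GraphConv-style layer in numpy. This is an illustrative sketch only, not PyG's actual implementation: it assumes the layer computes a root transform of each node's own features plus a weighted sum of transformed neighbor features, with hypothetical weight matrices w_root and w_nbr.

```python
import numpy as np

# Hypothetical sketch of a GraphConv-style layer (NOT PyG's implementation):
# out_i = x_i @ w_root + sum over edges (j -> i) of w_ij * (x_j @ w_nbr)
def graph_conv(x, edge_index, edge_weight, w_root, w_nbr):
    out = x @ w_root                    # transform each node's own features
    msg = x[edge_index[0]] @ w_nbr      # messages from source nodes
    msg = msg * edge_weight[:, None]    # scale each message by its edge weight
    np.add.at(out, edge_index[1], msg)  # sum-aggregate messages at target nodes
    return out

rng = np.random.default_rng(0)
num_nodes, in_channels, out_channels = 4, 3, 8
x = rng.normal(size=(num_nodes, in_channels))       # [num_nodes, in_channels]
edge_index = np.array([[0, 1, 2], [1, 2, 3]])       # (2, num_edges)
edge_weight = np.ones(3)                            # (num_edges,)
w_root = rng.normal(size=(in_channels, out_channels))
w_nbr = rng.normal(size=(in_channels, out_channels))

out = graph_conv(x, edge_index, edge_weight, w_root, w_nbr)
print(out.shape)  # (4, 8), i.e. [num_nodes, out_channels]
```

Whatever the layer does internally, the node dimension is preserved and only the channel dimension changes, which is exactly what the proposed shape documentation would state.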
Yeah, I sure can contribute. I am new to GNNs, but I can try adding documentation for the models I know of or have used.
Super. Let's start with a single layer, e.g., GraphConv, and go from there. Let me know if you need any more guidance.
Should edge_index and optional tensors like edge_weight be acknowledged as inputs as well?
Yeah, that would be something like:
edge_index: (2, num_edges)
edge_weight: (num_edges)
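The two shapes above can be checked with a tiny illustrative example. Here a three-node path graph 0 → 1 → 2 is encoded in the COO convention sketched above (first row = source nodes, second row = target nodes); the graph itself is made up for illustration.

```python
import numpy as np

# A 3-node path graph 0 -> 1 -> 2 in COO format (illustrative example)
edge_index = np.array([[0, 1],    # source nodes
                       [1, 2]])   # target nodes
edge_weight = np.array([0.5, 2.0])  # one scalar weight per edge

num_edges = edge_index.shape[1]
assert edge_index.shape == (2, num_edges)  # edge_index: (2, num_edges)
assert edge_weight.shape == (num_edges,)   # edge_weight: (num_edges)
```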
Here is a proposal for GraphConv:
I've tried to follow as closely as possible the conventions already established in PyTorch, as well as the mathematical notation present in the literature. Along these guidelines, x and edge_index are proxied by the calligraphic typefaces V and E respectively, while enclosure in vertical bars (|·|) denotes the number of elements.
It would also be possible to ditch the mathematical notation altogether in favor of more agnostic solutions, but considering that this framework is aimed specifically at GNNs, I believe some mathematical rigor is justified.
Of course, I would love to hear the community's opinion 😄.
I think this is great :) Please feel free to contribute!
Can you add to the documentation the input shape of a layer and the corresponding output shape after forward() has been executed? This is similar to the PyTorch documentation. E.g., the documentation for the 2D convolution layer, torch.nn.Conv2d(), states that: input = (N, C_in, H, W), output = (N, C_out, H_out, W_out).
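As an aside, the H_out and W_out in the Conv2d shapes can be computed from the standard convolution arithmetic. This is a pure-Python sketch of that formula (the helper name conv_out_size is made up for illustration; the formula itself is the one stated in the torch.nn.Conv2d docs):

```python
import math

# Sketch: output spatial size for a Conv2d-style layer
# (standard convolution arithmetic; helper name is hypothetical)
def conv_out_size(size, kernel, stride=1, padding=0, dilation=1):
    return math.floor((size + 2 * padding - dilation * (kernel - 1) - 1) / stride + 1)

# E.g. a (N, C_in, H, W) = (8, 3, 32, 32) batch through a 3x3 conv
# with stride 1 and padding 1, mapping C_in=3 to C_out=16:
N, C_in, H, W = 8, 3, 32, 32
C_out, k = 16, 3
H_out = conv_out_size(H, k, stride=1, padding=1)
W_out = conv_out_size(W, k, stride=1, padding=1)
print((N, C_out, H_out, W_out))  # (8, 16, 32, 32)
```

The proposed GNN documentation would play the same role: given the input shapes, a reader can predict the output shape without reading the layer's source.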
Motivation
I am fairly new to GNNs and pytorch-geometric. I think it will be a lot easier and smoother for newbies like me to get a sense of the dimensional transformations within a layer.
Additional context
torch.nn.Conv2d() -