Open Debangan-MishraIIIT opened 4 months ago
This is similar to any GNN layer; only the aggregation of the neighbors and the root node is a bit different (although it has the same expressivity as SAGEConv, since $-\alpha W$ can simply be substituted by $W$):
```python
out = self.propagate(edge_index, x=x)  # aggregate neighbor embeddings
out = self.lin(out)                    # apply the linear transform W
return x - alpha * out                 # subtract the scaled neighbor message from the root
```
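For completeness, a minimal self-contained sketch of such a layer (the `NegConv` name is hypothetical, and mean aggregation is assumed; the paper's exact reduction may differ):

```python
import torch
from torch.nn import Linear, Parameter
from torch_geometric.nn import MessagePassing


class NegConv(MessagePassing):
    # Computes x_i' = x_i - alpha * W * mean_{j in N(i)} x_j
    def __init__(self, channels):
        super().__init__(aggr='mean')  # any permutation-invariant reduction works here
        self.lin = Linear(channels, channels, bias=False)
        self.alpha = Parameter(torch.tensor(1.0))  # trainable scalar weight

    def forward(self, x, edge_index):
        out = self.propagate(edge_index, x=x)  # mean over neighbor embeddings
        out = self.lin(out)                    # W applied to the aggregated neighbors
        return x - self.alpha * out            # negative message passing step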
🚀 The feature, motivation and pitch
This paper: https://arxiv.org/abs/2301.11164 proposes a negative message passing scheme which seems to perform well on heterophilic graphs. The primary difference in their message passing scheme is an update of the following form:

$$\mathbf{x}_i' = \mathbf{x}_i - \alpha \, \mathbf{W} \sum_{j \in \mathcal{N}(i)} \mathbf{x}_j$$
In a conventional GNN, the embeddings of a node are obtained by aggregating the features of its neighbours in a permutation-invariant manner. The above method, however, requires the aggregated neighbour embeddings to be subtracted from the node's own embedding, with α as a trainable parameter.
Is there any way to achieve the above (and similarly customizable aggregations) in pytorch-geometric? Currently only simple reductions such as sum, mean, and max seem to be allowed. Looking through previous issues did not help, as none of the custom aggregations deal with the embeddings directly; they involve MLP or LSTM aggregations, which are a black box.
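A usage example for the hypothetical `NegConv` layer sketched above, on a toy graph:

```python
x = torch.randn(4, 16)                     # 4 nodes, 16-dim features
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])  # COO connectivity
conv = NegConv(16)
out = conv(x, edge_index)                  # shape: [4, 16]
```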
Alternatives
No response
Additional context
No response