pyg-team / pytorch_geometric

Graph Neural Network Library for PyTorch
https://pyg.org
MIT License
20.94k stars · 3.61k forks

Custom Negative Message Passing for Heterophilic Graphs #9529

Open Debangan-MishraIIIT opened 1 month ago

Debangan-MishraIIIT commented 1 month ago

🚀 The feature, motivation and pitch

This paper: https://arxiv.org/abs/2301.11164 proposes a negative message passing scheme which seems to perform well for heterophilic graphs. The primary difference in their message passing scheme is the following update [equation image not captured; per the reply below, it has the form x_i' = x_i − α · W · Σ_{j∈N(i)} x_j].

In a conventional GNN, the embeddings of a node are obtained by aggregating the features of its neighbours in a permutation-invariant manner. However, the above method requires the aggregated embeddings of the neighbours to be subtracted from the node's own embedding, with α as a trainable parameter.
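For concreteness, here is a plain-PyTorch sketch of that update on a toy graph (the graph, features, and α value are made up purely for illustration):

```python
import torch

# Toy graph: 4 nodes with 2 features each; edge_index is in COO
# format with row 0 = source nodes and row 1 = destination nodes.
x = torch.tensor([[1., 0.], [0., 1.], [2., 2.], [3., 1.]])
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 0, 3, 2]])
alpha = torch.nn.Parameter(torch.tensor(0.5))  # trainable in practice

# Sum the neighbour features arriving at each destination node.
src, dst = edge_index
agg = torch.zeros_like(x).index_add_(0, dst, x[src])

# Negative message passing: subtract the scaled neighbour
# aggregate from each node's own embedding.
out = x - alpha * agg
```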

Is there any way to achieve the above (and similarly highly customizable aggregations) in pytorch-geometric? Currently only simple aggregations such as sum, mean, and max seem to be allowed. Looking at the previous issues did not help, as none of the custom aggregations there deal with the embeddings directly; they involve MLP or LSTM aggregations, which are a black box.

Alternatives

No response

Additional context

No response

rusty1s commented 3 weeks ago

This is similar to any GNN layer. Only the combination of the aggregated neighbors and the root node is a bit different (although it has the same expressivity as SAGEConv, since -\alpha W can simply be substituted by W):


out = self.propagate(edge_index, x=x)
out = self.lin(out)
return x - alpha * out