Closed velix closed 2 years ago
A complete graph may not work with a graph attention network. In a GAT, a node i queries its neighbors, but in a complete graph every node is a neighbor of all the other nodes, so each node's query runs over the same set of remaining nodes. This leads to the same embedding for every node.
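As a sanity check (a minimal NumPy sketch of a single GAT head, not the actual `GATConv` code), the collapse can be reproduced directly. With nonnegative features and parameters, the LeakyReLU in the attention logits stays in its linear region, the query node's term cancels inside the row-wise softmax, and every node ends up with the identical aggregated embedding:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out = 100, 2, 16

# Node features: coordinates in [0, 1] (nonnegative).
h = rng.random((n, d_in))

# Nonnegative parameters keep every attention logit >= 0, so the
# LeakyReLU below is the identity and the cancellation is exact.
W = rng.random((d_out, d_in))   # shared linear transform
a_l = rng.random(d_out)         # "query" half of the attention vector a
a_r = rng.random(d_out)         # "key" half of the attention vector a

z = h @ W.T                     # (n, d_out) transformed features

# GAT logits on a complete graph (with self-loops):
# e[i, j] = LeakyReLU(a_l . z_i + a_r . z_j)
e = (z @ a_l)[:, None] + (z @ a_r)[None, :]
e = np.maximum(e, 0.2 * e)      # LeakyReLU; identity here since e >= 0

# Row-wise softmax over neighbors j: the a_l . z_i term is constant
# in j, so it cancels and alpha[i, :] is the same for every i.
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

out = alpha @ z                 # (n, d_out) output embeddings

# Every node attends over the same neighbor set with the same
# weights, so every output row is identical.
print(np.allclose(out, out[0]))  # True
```

When the logits can go negative, the LeakyReLU breaks the exact cancellation, but in practice the rows still collapse to near-identical values because every node sees the same neighborhood.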
If you have any questions, feel free to reopen the issue.
I am trying to train a 3-layer GAT network with multiple heads on a complete graph of 100 nodes. Each node represents a point in space, so a node's features are its coordinates in [0, 1]. The output of the network is the same embedding for each of the nodes: a `(128, 100)` matrix with every column identical to the rest. A toy example I put together likewise outputs the same 16-d embedding for each node.
Am I misunderstanding something about GATs here, or does the GATConv implementation not work with complete graphs?