rampasek / GraphGPS

Recipe for a General, Powerful, Scalable Graph Transformer
MIT License

Graph generation #24

Closed: phshah95 closed this issue 1 year ago

phshah95 commented 1 year ago

I am attempting to use this to analyze chess games represented as graphs. Is it possible to modify this model for graph generation?

rampasek commented 1 year ago

Hi! Unfortunately I don't have experience with graph generation. GPS and other GNNs/GTs could be used for the graph representation, but honestly I would expect CNNs, or Transformers with ViT-style tokenization, to be a better fit given the regular structure and small size of the chess-board grid. Best, Ladislav
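For concreteness, here is a minimal PyTorch sketch of what ViT-style tokenization of a chess board could look like, treating each of the 64 squares as a token. Everything here (the piece encoding, model sizes, and the scalar evaluation head) is an illustrative assumption, not anything from GraphGPS:

```python
import torch
import torch.nn as nn

class BoardTransformer(nn.Module):
    """Illustrative only: each of the 64 squares is one token, analogous
    to a ViT patch. Assumed piece vocabulary: 0 = empty square, 1-12 = the
    six piece types for each color."""
    def __init__(self, d_model=64, nhead=4, num_layers=2, num_piece_types=13):
        super().__init__()
        self.piece_emb = nn.Embedding(num_piece_types, d_model)
        self.pos_emb = nn.Parameter(torch.zeros(64, d_model))  # one per square
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)  # e.g. a scalar position evaluation

    def forward(self, board):                      # board: (batch, 64) piece ids
        x = self.piece_emb(board) + self.pos_emb   # (batch, 64, d_model)
        x = self.encoder(x)
        return self.head(x.mean(dim=1))            # pool tokens -> 1 output/board

model = BoardTransformer()
boards = torch.randint(0, 13, (8, 64))  # a batch of 8 random boards
print(model(boards).shape)              # torch.Size([8, 1])
```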

phshah95 commented 1 year ago

Oh okay gotcha. Just wondering, what is the input/prediction of your model? Graph transformer novice here 😁

rampasek commented 1 year ago

No problem. The input is a graph: a collection of (attributed) nodes and (attributed) edges between them. The output is task-dependent; it can be:

- a global graph property (1 graph -> 1 output),
- individual node properties (1 graph with N nodes -> N outputs),
- link prediction (for two nodes in a graph, what is the probability that there should be an edge between them?),
- and others.

Typically for the GPS model we assume the so-called inductive regime, where we train on one (large) set of graphs and test on another set of graphs. But there is also a whole world of transductive learning, or the 1-graph regime, where the input is a single (large) graph and we need to generalize from a labeled set of nodes to an unlabeled set of nodes within that same graph. I hope that helps!
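To make the input/output shapes concrete, here is a small sketch using PyTorch Geometric (which GraphGPS builds on). The tiny graph, the toy GCN layer, and the linear heads are made up for illustration; they stand in for the full GPS model:

```python
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool

# A toy attributed graph: 4 nodes with 3 features each, 4 undirected edges
# stored as directed pairs in edge_index (COO format), each with 2 features.
x = torch.randn(4, 3)                                # node attributes
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3, 3, 0],
                           [1, 0, 2, 1, 3, 2, 0, 3]])
edge_attr = torch.randn(edge_index.size(1), 2)       # edge attributes
data = Data(x=x, edge_index=edge_index, edge_attr=edge_attr)

conv = GCNConv(3, 16)                  # toy message-passing layer
h = conv(data.x, data.edge_index)      # node embeddings: (4, 16)

# Node-level task: 1 graph with N nodes -> N outputs.
node_head = torch.nn.Linear(16, 1)
print(node_head(h).shape)              # torch.Size([4, 1])

# Graph-level task: 1 graph -> 1 output (pool node embeddings first).
batch = torch.zeros(4, dtype=torch.long)   # all nodes belong to graph 0
graph_head = torch.nn.Linear(16, 1)
print(graph_head(global_mean_pool(h, batch)).shape)  # torch.Size([1, 1])

# Link prediction: score a candidate edge, e.g. a dot product of the two
# node embeddings squashed to a probability.
prob = torch.sigmoid((h[0] * h[2]).sum())
print(prob)                            # P(edge between nodes 0 and 2)
```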