Closed InfluenceFunctional closed 5 months ago
Migrate away from the GNN `TransformerConv` layer.
Also: the softmax utility used in the attention calculation appears slow and seems to scale superlinearly with batch size.
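If the superlinear scaling comes from how the per-node (segment-wise) softmax is computed, a scatter-based version runs in O(E) over the edge scores with no per-node Python loop. A minimal NumPy sketch of the idea (the function name, argument names, and shapes here are assumptions for illustration, not the library's actual API):

```python
import numpy as np

def segment_softmax(scores, index, num_segments):
    """Numerically stable softmax over `scores`, grouped by `index`.

    scores: 1-D array of attention logits, one per edge.
    index:  1-D int array mapping each score to its destination segment.
    Runs in O(E) using scatter ops instead of looping over segments.
    """
    # Per-segment maximum, for numerical stability.
    seg_max = np.full(num_segments, -np.inf)
    np.maximum.at(seg_max, index, scores)

    # Stabilized exponentials.
    exp = np.exp(scores - seg_max[index])

    # Per-segment sum of exponentials.
    seg_sum = np.zeros(num_segments)
    np.add.at(seg_sum, index, exp)

    return exp / seg_sum[index]

# Two edges into node 0, two into node 1.
scores = np.array([1.0, 2.0, 3.0, 0.5])
index = np.array([0, 0, 1, 1])
out = segment_softmax(scores, index, num_segments=2)
```

Each group of outputs sums to 1, and the cost grows linearly with the number of edge scores, which is the scaling behavior a batched attention softmax should have.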