graphdeeplearning / graphtransformer

Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
https://arxiv.org/abs/2012.09699
MIT License
872 stars 134 forks

checkpoints of pretrained models #16

Open AmeenAli opened 2 years ago

AmeenAli commented 2 years ago

Hello, thanks for sharing this amazing work! Any chance you can share the pretrained models used in the paper? Thanks!

vijaydwivedi75 commented 2 years ago

Hi @AmeenAli, since the models we train are on medium-scale benchmark datasets rather than any large datasets, and we do not use the checkpoints for any transfer learning, we did not share the model checkpoints.

You may use the scripts in the repo to reproduce the checkpoints; training a single model takes under 24 hours, which is the maximum training time we consider.
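For anyone retraining from the scripts: the core operation in the paper is standard scaled dot-product attention restricted to each node's graph neighbourhood. A minimal single-head NumPy sketch of that idea is below; it is an illustrative reconstruction, not the authors' DGL implementation, and the function name and arguments (`graph_attention`, `adj`, the three weight matrices) are hypothetical.

```python
import numpy as np

def graph_attention(h, adj, Wq, Wk, Wv):
    """Single-head scaled dot-product attention restricted to graph
    neighbours (NumPy sketch; self-loops are assumed present in adj)."""
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)
    # Mask out non-neighbours so each node attends only to its neighbourhood.
    scores = np.where(adj > 0, scores, -np.inf)
    # Numerically stable softmax over each node's neighbourhood.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

A node with only a self-loop simply recovers its own value vector, and changing the features of a non-neighbour leaves a node's output untouched, which is the key difference from a fully-connected Transformer layer.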