HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

Missing the code for structure-text grounding on many graphs #20

Closed AGTSAAA closed 10 months ago

AGTSAAA commented 11 months ago

Hi,

Thank you very much for your work. Could you please provide the pre-training code for structure-text grounding on large graphs such as Arxiv? Providing only the checkpoint makes it hard to reproduce the experimental results on large graphs.

yuh-yang commented 11 months ago

Hi!

The code is at ./text-graph-grounding and can be applied directly to Arxiv. You can refer to the example data (Cora) and prepare our released Arxiv data to match that format.
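As a rough illustration of what "matching the format" might involve, here is a minimal sketch that packages a citation graph into a "one raw text per node plus edge list" shape. The function name, dictionary keys, and structure are assumptions for illustration only; the authoritative format is whatever the Cora example data in ./text-graph-grounding uses.

```python
# Hypothetical sketch only: the keys and layout below are assumptions,
# not the repo's actual data format. Check the Cora example for the
# real expected files.
import json

def build_grounding_data(node_texts, edges):
    """node_texts: list[str], one title/abstract per node (0-based ids).
    edges: list of (src, dst) citation links between those node ids."""
    n = len(node_texts)
    # Validate that every edge endpoint refers to an existing node.
    assert all(0 <= s < n and 0 <= d < n for s, d in edges), \
        "edge endpoints must be valid node ids"
    return {
        "num_nodes": n,
        "texts": node_texts,                      # raw text per node
        "edge_index": [list(e) for e in edges],   # directed citations
    }

# Tiny toy graph: three papers, two citation edges.
data = build_grounding_data(
    ["Paper A abstract", "Paper B abstract", "Paper C abstract"],
    [(0, 1), (1, 2)],
)
serialized = json.dumps(data)
```

For Arxiv-scale graphs the same structure would simply hold many more nodes and edges; the point is only that the grounding code pairs each node's text with the graph structure around it.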