HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

The URL of the graph transformer needs to be checked #4

Closed: wangjade54241 closed this issue 10 months ago

wangjade54241 commented 10 months ago

Sorry to bother you... The URL given for the graph transformer seems to be the same as GraphGPT's. Do you mean https://github.com/seongjunyun/Graph_Transformer_Networks?

wangjade54241 commented 10 months ago

Please also check all_graph_data.pt, thanks.
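For anyone else checking this file, a minimal sketch of sanity-checking the released graph data once downloaded, assuming it is a plain torch.save archive of PyG graph objects (the file name comes from this thread; the local path and the printed keys are illustrative):

```python
import torch

# Path is illustrative: point this at your downloaded copy of the file
# discussed above (also released as All_pyg_graph_data on Hugging Face).
graph_data = torch.load("all_graph_data.pt", map_location="cpu")

# Inspect the top-level structure before wiring it into training code.
print(type(graph_data))
if isinstance(graph_data, dict):
    for name, value in graph_data.items():
        print(name, type(value), value)
```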

zhihui-shao commented 10 months ago

Was the LLM not fine-tuned at either stage in the paper?

tjb-tech commented 10 months ago

Sorry for the missing updates in the main body of README.md; we have updated the link. You can also refer to the table at the very beginning of the README.md for all the resources you may need. Thank you again for your attention to our GraphGPT. 🤗

| 🤗 Huggingface Address | 🎯 Description |
| --- | --- |
| huggingface.co/Jiabin99/GraphGPT-7B-mix-all | The checkpoint of our GraphGPT, based on Vicuna-7B-v1.5 and tuned on the instruction data Arxiv-PubMed-mix-NC-LP. |
| huggingface.co/Jiabin99/Arxiv-PubMed-GraphCLIP-GT | The checkpoint of the pre-trained graph transformer (GT), trained on Arxiv and PubMed using Text-Graph grounding. |
| huggingface.co/datasets/Jiabin99/Arxiv-PubMed-mix-NC-LP | The mixed instruction dataset with node classification (NC) and link prediction (LP) on Arxiv and PubMed. |
| huggingface.co/datasets/Jiabin99/GraphGPT-eval-instruction | All instruction datasets used for our evaluation. |
| huggingface.co/datasets/Jiabin99/All_pyg_graph_data | All utilized graph data, merged into one dataset. |
| huggingface.co/datasets/Jiabin99/graph-matching | The instruction data used in the graph-matching stage. |
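For anyone wiring these resources into their own setup, a minimal sketch of fetching them with the huggingface_hub client; the repo ids come from the table above, while the variable names and local handling are illustrative:

```python
from huggingface_hub import snapshot_download

# Model checkpoints (repo ids taken from the table above).
graphgpt_dir = snapshot_download(repo_id="Jiabin99/GraphGPT-7B-mix-all")
gt_dir = snapshot_download(repo_id="Jiabin99/Arxiv-PubMed-GraphCLIP-GT")

# Datasets on the Hub need repo_type="dataset".
graph_data_dir = snapshot_download(
    repo_id="Jiabin99/All_pyg_graph_data", repo_type="dataset"
)

# Each call returns the local snapshot directory to load files from.
print(graphgpt_dir, gt_dir, graph_data_dir, sep="\n")
```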