HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

Question about the graph encoder #46

Closed: serendipity800 closed this issue 4 months ago

serendipity800 commented 6 months ago

In the GraphGPT paper, the graph encoder is described as a graph transformer, citing [62] Seongjun Yun, Minbyul Jeong, Raehyun Kim, Jaewoo Kang, and Hyunwoo J Kim. Graph transformer networks. In NeurIPS, Vol. 32 (citation [61] also points to the same paper). That network is GTN, which is not a typical transformer-style architecture. However, the open-source code in this repository uses a GNN with positional encodings, graph structure, and multi-head attention (similar to github.com/HKUDS/GFormer). Is the citation wrong?
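For context on the distinction being drawn: GTN [62] learns soft meta-path selection over a heterogeneous graph rather than applying token-style attention, whereas the pattern the issue describes (positional encodings plus adjacency-masked multi-head attention) looks roughly like the sketch below. This is a minimal illustrative PyTorch sketch of that general pattern only; the class name, tensor shapes, and positional-encoding choice are assumptions, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

class GraphTransformerLayer(nn.Module):
    """Illustrative transformer-style graph layer (not GraphGPT's code):
    node features plus a structural positional encoding pass through
    multi-head self-attention masked by the adjacency, then an FFN."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x, adj, pos_enc):
        # x:       (batch, num_nodes, dim) node features
        # adj:     (batch, num_nodes, num_nodes) bool, True = edge;
        #          assumed to include self-loops so no row is fully masked
        # pos_enc: (batch, num_nodes, dim) structural positional encoding,
        #          e.g. projected Laplacian eigenvectors (an assumption here)
        h = x + pos_enc
        # MultiheadAttention blocks positions where the mask is True,
        # so invert the adjacency: non-edges may not attend.
        mask = (~adj).repeat_interleave(self.num_heads, dim=0)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)
        h = self.norm1(h + attn_out)
        return self.norm2(h + self.ffn(h))

# Toy usage: 5 nodes, 64-dim features, random adjacency with self-loops.
n, d = 5, 64
adj = (torch.rand(1, n, n) > 0.5) | torch.eye(n, dtype=torch.bool)
layer = GraphTransformerLayer(d)
out = layer(torch.randn(1, n, d), adj, torch.randn(1, n, d))
print(out.shape)  # torch.Size([1, 5, 64])
```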
HKUDS commented 4 months ago

Thank you for your interest in our research work. Our model is designed to be flexible and can accommodate various graph neural networks (GNNs) as the graph encoder. The specific encoder implementation may vary with the dataset and scenario, so the graph transformer architecture can be tailored to the requirements at hand.
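One way to read "accommodate various GNNs" in code terms is that the rest of the pipeline depends only on the encoder's call signature, not its internals. The sketch below is hypothetical: GraphEncoder, MeanAggregator, and build_graph_tokens are illustrative names under that assumption, not the GraphGPT codebase's API.

```python
from typing import Protocol
import torch
import torch.nn as nn

class GraphEncoder(Protocol):
    """Hypothetical interface: any module mapping (node features,
    adjacency) to node embeddings can play the graph-encoder role."""
    def __call__(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor: ...

class MeanAggregator(nn.Module):
    """Deliberately simple stand-in encoder: average neighbour features,
    then project. Nothing downstream depends on the encoder's internals,
    only on its (x, adj) -> embeddings signature."""
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x, adj):
        deg = adj.float().sum(-1, keepdim=True).clamp(min=1.0)
        return self.proj(adj.float() @ x / deg)

def build_graph_tokens(encoder: GraphEncoder, x, adj):
    # Downstream instruction tuning consumes only the returned node
    # embeddings, so a GCN, GAT, or graph transformer is interchangeable.
    return encoder(x, adj)

tokens = build_graph_tokens(MeanAggregator(64),
                            torch.randn(1, 5, 64),
                            torch.rand(1, 5, 5) > 0.5)
print(tokens.shape)  # torch.Size([1, 5, 64])
```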