HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

About the text features #28

Closed · serendipity800 closed this 7 months ago

serendipity800 commented 7 months ago

It is really a very original and amazing work! I have a few questions to consult. How do you transform the 768/1024-dimensional BERT embedding into the 128-dimensional embedding used for the original node embedding? Thanks for your reply. Happy Christmas!

tjb-tech commented 7 months ago

Merry Christmas! We use this version: https://huggingface.co/google/bert_uncased_L-2_H-128_A-2.
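Since that checkpoint (BERT-Tiny, `L-2_H-128_A-2`) already has a hidden size of 128, no extra projection from 768/1024 dimensions is needed. Below is a minimal sketch of how one might extract 128-dimensional text embeddings with that model using Hugging Face `transformers`; the mean pooling over tokens is an assumption for illustration and may differ from the pooling actually used in GraphGPT.

```python
# Sketch: 128-dim node text embeddings with google/bert_uncased_L-2_H-128_A-2.
# NOTE: mean pooling below is an assumption, not necessarily GraphGPT's choice.
import torch
from transformers import AutoTokenizer, AutoModel

name = "google/bert_uncased_L-2_H-128_A-2"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)
model.eval()

texts = ["title and abstract of node 0", "title and abstract of node 1"]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512,
                  return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state        # (B, T, 128)

# Mean-pool over non-padding tokens to get one 128-dim vector per node.
mask = batch["attention_mask"].unsqueeze(-1).float() # (B, T, 1)
node_emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(node_emb.shape)                                # torch.Size([2, 128])
```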