HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

A question about training weights of embedding #34

Closed · AriKing11 closed this 9 months ago

AriKing11 commented 11 months ago

I have a question about the training of the embedding weights. I used my own dataset to run stage 1 (which includes tuning the embedding weights of the new graph tokens, e.g. DEFAULT_GRAPH_TOKEN = "<graph>"), but the weights became NaN almost immediately, and I don't know why. Thanks for your patience.
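For context, the stage-1 setup this question refers to, adding a new graph token and training its embedding, typically follows the LLaVA-style pattern sketched below. This is a minimal sketch assuming a HuggingFace-style tokenizer and model; the checkpoint name and the mean-initialization step are assumptions for illustration, not the exact GraphGPT code.

```python
# Sketch of adding a graph token and initializing its embedding row.
# Assumes HuggingFace transformers; the checkpoint name is illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

DEFAULT_GRAPH_TOKEN = "<graph>"

tokenizer = AutoTokenizer.from_pretrained("lmsys/vicuna-7b-v1.5")
model = AutoModelForCausalLM.from_pretrained("lmsys/vicuna-7b-v1.5")

# Register the new token and grow the embedding matrix to match.
num_added = tokenizer.add_tokens([DEFAULT_GRAPH_TOKEN], special_tokens=True)
model.resize_token_embeddings(len(tokenizer))

# The appended rows are randomly initialized; LLaVA-style code commonly
# re-initializes them to the mean of the existing embeddings so the new
# token starts in-distribution rather than at an extreme value.
if num_added > 0:
    with torch.no_grad():
        emb = model.get_input_embeddings().weight
        emb[-num_added:] = emb[:-num_added].mean(dim=0, keepdim=True)
```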

tjb-tech commented 10 months ago

> I have a question about the training of the embedding weights. I used my own dataset to run stage 1 (which includes tuning the embedding weights of the new graph tokens, e.g. DEFAULT_GRAPH_TOKEN = "<graph>"), but the weights became NaN almost immediately, and I don't know why. Thanks for your patience.

Thanks for your interest! May I ask for details of your error? Do the weights themselves become NaN, or does the loss become NaN?
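Both conditions can be checked directly in the training loop; a generic PyTorch sketch (the helper name here is illustrative, not part of GraphGPT):

```python
# Distinguish "weights become NaN" from "loss becomes NaN".
import torch

def embeddings_have_nan(model):
    """Return True if any input-embedding weight is NaN."""
    w = model.get_input_embeddings().weight
    return torch.isnan(w).any().item()

# Inside the training step, roughly:
#   loss = model(**batch).loss
#   if torch.isnan(loss):            # loss goes NaN first: data / lr / fp16 overflow
#       ...
#   optimizer.step()
#   if embeddings_have_nan(model):   # weights NaN: a bad update already landed
#       ...

# To locate the first op that produces NaN in the backward pass:
torch.autograd.set_detect_anomaly(True)
```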

msy0513 commented 3 months ago

> > I have a question about the training of the embedding weights. I used my own dataset to run stage 1 (which includes tuning the embedding weights of the new graph tokens, e.g. DEFAULT_GRAPH_TOKEN = "<graph>"), but the weights became NaN almost immediately, and I don't know why. Thanks for your patience.
>
> Thanks for your interest! May I ask for details of your error? Do the weights themselves become NaN, or does the loss become NaN?

After switching to my own data, I get train_loss=nan in both stage 1 and stage 2. Is this normal? How can I solve this problem?
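For anyone hitting the same train_loss=nan after swapping datasets, the usual first checks are lowering the learning rate, clipping gradients, and preferring bf16 over fp16 (fp16's narrow dynamic range overflows more easily). A hedged HuggingFace TrainingArguments sketch with illustrative values, not the exact GraphGPT config:

```python
# Generic mitigations for train_loss=nan; all values are illustrative.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./checkpoints",  # illustrative path
    learning_rate=2e-5,          # try lowering if the loss diverges to NaN
    max_grad_norm=1.0,           # gradient clipping bounds exploding updates
    bf16=True,                   # bf16 keeps fp32's exponent range; less overflow-prone
    fp16=False,
    logging_steps=1,             # log every step to see exactly when NaN first appears
)
```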