AILab-CVC / SEED

Official implementation of SEED-LLaMA (ICLR 2024).
https://ailab-cvc.github.io/seed

Codebook Training Epochs #35

Open Revliter opened 2 months ago

Revliter commented 2 months ago

Hello, congratulations on the successful development of the SEED model! I am impressed by its capabilities and want to reproduce it locally. However, I am running into something confusing. The config for the codebook training of the SEED tokenizer specifies up to 500 epochs of training over 500M samples. Is this really the intended codebook training config? At that scale, finishing the run would take an enormous number of GPU hours. I would be grateful if you could clarify this or offer some advice. Thanks for your generous help.
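To make the concern concrete, here is a rough back-of-envelope sketch of the optimizer step count implied by that config. The epoch count (500) and dataset size (500M) come from the question; the per-GPU batch size and GPU count are purely hypothetical placeholders, not values from the SEED repo.

```python
def total_steps(epochs: int, dataset_size: int, per_gpu_batch: int, num_gpus: int) -> int:
    """Optimizer steps, assuming each sample is seen once per epoch."""
    global_batch = per_gpu_batch * num_gpus
    return epochs * dataset_size // global_batch

# 500 epochs over 500M samples (from the config in question);
# batch size 256 per GPU on 32 GPUs is an assumed setup for illustration.
steps = total_steps(epochs=500, dataset_size=500_000_000,
                    per_gpu_batch=256, num_gpus=32)
print(steps)  # on the order of tens of millions of steps
```

Even under generous throughput assumptions, a step count of this magnitude would translate into a very large GPU-hour budget, which is why clarification on the config would help.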

luohao123 commented 2 months ago

Have you got the data?