OpenMOSS / AnyGPT

Code for "AnyGPT: Unified Multimodal LLM with Discrete Sequence Modeling"

Hi, when will the pre-training related code & scripts be released? #18

Open XL2248 opened 2 months ago

XL2248 commented 2 months ago

I see you have released the mmpretrain file. When do you plan to release all the code and scripts? I believe this would benefit the community a lot and attract researchers' interest in your great work, as MOSS did. Also, how many A100 GPUs were used in the pre-training stage, and how long did it take?

JunZhan2000 commented 2 months ago

In fact, you only need the code for training an LLM. I will sort out the code and release it within the next two weeks. Pre-training took 96 80GB A100 GPUs for about a week; instruction fine-tuning took much less.
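
For context, 96 A100-80GB GPUs for a week is roughly 96 × 24 × 7 ≈ 16,000 GPU-hours of pre-training compute. The remark that "you only need the code for training an LLM" follows from AnyGPT's design: every modality is first converted to discrete tokens, so pre-training reduces to next-token prediction over interleaved sequences. Below is a minimal sketch of that idea, assuming the multimodal data has already been tokenized into placeholder tokens; the base model, token names, and toy dataset are stand-ins, not AnyGPT's actual setup.

```python
# Minimal sketch (not the released AnyGPT code): plain causal-LM training over
# sequences in which non-text modalities have already been replaced by
# discrete tokens. Base model, special tokens, and data below are stand-ins.
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base_model = "gpt2"  # assumption: any decoder-only LM works; AnyGPT builds on a larger backbone
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Assumption: a multimodal tokenizer emits codebook indices that are written
# out as extra text tokens such as "<img_0>" ... "<img_8191>".
tokenizer.add_tokens([f"<img_{i}>" for i in range(8192)])
model.resize_token_embeddings(len(tokenizer))

# Toy interleaved text/image sample; the real corpus would supply these.
samples = [{"text": "A photo of a cat: <img_12> <img_907> <img_44>"}]
train_ds = Dataset.from_list(samples).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=1, num_train_epochs=1),
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),  # next-token prediction
)
trainer.train()
```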

XL2248 commented 1 month ago

Hi, are the code and scripts ready for release?