shubhamgarg21 opened 11 months ago
Hi, I am wondering whether the training code for SEED-LLaMA will be made available at any point?

We have released the training code of SEED-LLaMA, including the SEED tokenizer, Multimodal LLM pretraining, and instruction tuning. Our Multimodal LLM training codebase supports (1) large-scale multi-node training with DeepSpeed and (2) multiple highly efficient training datapipes.
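For anyone orienting themselves around the multi-node DeepSpeed part, the sketch below shows the general pattern such a codebase follows: a JSON config plus `deepspeed.initialize` wrapping the model. The toy model, file name, and config values here are placeholders for illustration, not the actual SEED-LLaMA training code.

```python
# Minimal sketch of DeepSpeed-based training (illustrative only, not the
# SEED-LLaMA codebase). Assumes a hypothetical ds_config.json such as:
#   {"train_batch_size": 64,
#    "optimizer": {"type": "AdamW", "params": {"lr": 1e-4}},
#    "zero_optimization": {"stage": 2}}
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)  # placeholder for the multimodal LLM

# deepspeed.initialize wraps the model and optimizer for distributed training;
# on a multi-node run the launcher sets up the process group automatically.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config="ds_config.json",
)

# One illustrative training step.
inputs = torch.randn(8, 1024, device=engine.device)
loss = engine(inputs).pow(2).mean()
engine.backward(loss)
engine.step()
```

A multi-node run is then typically started with the `deepspeed` launcher and a hostfile listing the participating machines, e.g. `deepspeed --hostfile=hostfile train.py`.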