pjlab-sys4nlp / llama-moe

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
https://arxiv.org/abs/2406.16554
Apache License 2.0

CPT: add more args and exec scripts #39

Closed. Spico197 closed this 9 months ago.

Spico197 commented 9 months ago
DaizeDong commented 9 months ago

well done
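
The PR title indicates that the continual pre-training (CPT) pipeline gained additional command-line arguments and execution scripts. As a rough, hypothetical sketch of what such an argument surface can look like (the flag names below are illustrative and not taken from this PR or the repository):

```python
# Hypothetical sketch (not the PR's actual code): exposing extra continual
# pre-training (CPT) options as command-line arguments. All flag names and
# defaults here are illustrative assumptions.
import argparse


def build_cpt_arg_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Continual pre-training launcher (illustrative sketch)"
    )
    # Data and model paths (hypothetical names)
    parser.add_argument("--model_name_or_path", type=str, required=True)
    parser.add_argument("--dataset_dir", type=str, required=True)
    parser.add_argument("--output_dir", type=str, default="outputs/cpt")
    # Optimization-related args a CPT script typically needs
    parser.add_argument("--learning_rate", type=float, default=2e-4)
    parser.add_argument("--max_steps", type=int, default=100_000)
    parser.add_argument("--per_device_train_batch_size", type=int, default=8)
    parser.add_argument("--gradient_accumulation_steps", type=int, default=4)
    # MoE-specific knobs (illustrative)
    parser.add_argument("--num_experts", type=int, default=16)
    parser.add_argument("--num_selected_experts", type=int, default=4)
    return parser


if __name__ == "__main__":
    args = build_cpt_arg_parser().parse_args()
    # Placeholder: in a real exec script these args would be forwarded to the
    # training entry point (e.g. via a shell launcher or torchrun command).
    print(vars(args))
```

Such a parser would usually be wrapped by shell "exec scripts" that pin a specific configuration per experiment; the actual arguments and scripts added by this PR are in the linked commits.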