pjlab-sys4nlp/llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
https://arxiv.org/abs/2406.16554
Apache License 2.0
How can we train an MoE starting from LLaMA-13B?
#60
Closed
xyjsjruiliu
closed
6 months ago
xyjsjruiliu
commented
7 months ago
As stated in the title.
xyjsjruiliu
commented
6 months ago
Resolved.