pjlab-sys4nlp/llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
https://arxiv.org/abs/2406.16554
Apache License 2.0
CPT: add meta info during tokenization (#37)
Closed. Spico197 closed this 10 months ago.
Spico197 commented 10 months ago:

- add meta info during tokenization
- add `gate_load_vis.py` for gate load visualization
- add a learnable scale factor
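The first change above can be sketched as follows. This is a minimal, hypothetical helper (not the PR's actual code): it attaches per-sample meta info, such as the source name and sample index, to each tokenized record so samples can be traced back to their origin after shuffling. The `toy_tokenize` stand-in replaces a real tokenizer for illustration.

```python
def toy_tokenize(text):
    # Stand-in tokenizer: maps each whitespace-split token to a pseudo token id.
    return [abs(hash(word)) % 32000 for word in text.split()]


def tokenize_with_meta(texts, tokenizer, source):
    """Tokenize texts and attach per-sample meta info (source name, index)."""
    return [
        {"input_ids": tokenizer(text), "meta": {"source": source, "index": i}}
        for i, text in enumerate(texts)
    ]


records = tokenize_with_meta(["hello world", "foo"], toy_tokenize, "wiki")
```

Keeping the meta dict alongside `input_ids` (rather than in a parallel file) means the provenance survives any later dataset shuffling or filtering step.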
DaizeDong commented 10 months ago: Approved.