pjlab-sys4nlp/llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
https://arxiv.org/abs/2406.16554
Apache License 2.0
883 stars · 46 forks
CPT: add eval support #35
Closed — Spico197 closed this 1 year ago

Spico197 commented 1 year ago
- TensorBoard: add TGS (tokens per GPU per second) and MFU (model FLOPs utilization) metrics; update the logging step for expert load & importance
- add eval support; fix a bug where `balance_loss` was `None` during eval with gradient checkpointing
- update the logging strategy
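A minimal sketch of the kind of guard the second item describes, assuming the model's forward pass returns a router balance loss only when the MoE routing statistics are computed (e.g. in training mode); the function name and scalar types are illustrative, not the repository's actual API:

```python
def total_loss(lm_loss: float, balance_loss, coeff: float = 0.01) -> float:
    """Combine the language-modeling loss with the MoE load-balancing loss.

    During eval (especially under gradient checkpointing) the router may not
    produce a balance loss, so `balance_loss` can be None; treat it as zero
    instead of crashing on `None` arithmetic.
    """
    if balance_loss is None:
        return lm_loss
    return lm_loss + coeff * balance_loss


# Eval step: no balance loss available.
eval_loss = total_loss(2.0, None)        # -> 2.0
# Training step: balance loss weighted by its coefficient.
train_loss = total_loss(2.0, 1.0)        # -> 2.01
```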