pjlab-sys4nlp / llama-moe

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
https://arxiv.org/abs/2406.16554
Apache License 2.0

Data clustering: add tokenization for clustered data, fix training & eval bugs in `moe_gates.py` #24

Closed · Spico197 closed this 1 year ago
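
The PR title refers to adding tokenization for the clustered data. As a rough illustration only, here is a minimal sketch of what such a step could look like, assuming per-cluster plain-text files and a HuggingFace tokenizer; the `cluster_*.txt` layout, the `tokenize_clustered_data` helper, and the tokenizer name are assumptions for illustration, not the PR's actual code.

```python
# Hypothetical sketch: tokenize pre-clustered text data with a HuggingFace tokenizer.
# File layout, cluster naming, and tokenizer choice are illustrative assumptions.
from pathlib import Path

from transformers import AutoTokenizer


def tokenize_clustered_data(data_dir: str, tokenizer_name: str = "meta-llama/Llama-2-7b-hf"):
    """Tokenize every line of every per-cluster text file under `data_dir`."""
    tokenizer = AutoTokenizer.from_pretrained(tokenizer_name)
    tokenized = {}  # cluster name -> list of token-id sequences
    for cluster_file in sorted(Path(data_dir).glob("cluster_*.txt")):
        lines = cluster_file.read_text(encoding="utf-8").splitlines()
        # Batch-encode the cluster's documents; truncation keeps sequences bounded.
        enc = tokenizer(lines, truncation=True, max_length=2048)
        tokenized[cluster_file.stem] = enc["input_ids"]
    return tokenized
```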