LINs-lab / DynMoE
[Preprint] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
https://arxiv.org/abs/2405.14297
Apache License 2.0
50 stars, 9 forks
Issues
#3  EMoE Language Evaluation
By caichaoxiang, opened 3 days ago. 1 comment.
#2  Evaluation on llava-bench-in-the-wild for the MoE-LLaVA experiments
By sharkdrop, closed 1 month ago. 4 comments.
#1  Evaluation of the GMoE experiments
By sharkdrop, closed 1 month ago. 4 comments.