codecaution / Awesome-Mixture-of-Experts-Papers
A curated reading list of research in Mixture-of-Experts (MoE).
Update README.md #5 (Closed)
LQBDD closed this 2 years ago
LQBDD commented 2 years ago
Add 3 papers:
Towards More Effective and Economic Sparsely-Activated Model
Alpa: Automating Inter- and Intra-Operator Parallelism for Distributed Deep Learning
Pathways: Asynchronous Distributed Dataflow for ML