codecaution / Awesome-Mixture-of-Experts-Papers

A curated reading list of research on Mixture-of-Experts (MoE).

Update README.md #5

Closed · LQBDD closed 2 years ago

LQBDD commented 2 years ago

Update 3 papers:

  1. Towards More Effective and Economic Sparsely-Activated Model
  2. Alpa: Automating Inter- and Intra-Operator Parallelism for Distributed Deep Learning
  3. Pathways: Asynchronous Distributed Dataflow for ML
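For readers new to the topic, here is a minimal sketch of top-k expert routing, the core mechanism behind sparsely-activated MoE models like the one in the first paper above. This is a hypothetical illustration, not code from any of the listed papers; the class name `TopKMoE` and all hyperparameters are made up for the example.

```python
# Minimal top-k MoE layer: each token is routed to only k of n experts,
# so most expert parameters stay inactive per token (sparse activation).
# Hypothetical sketch, not from any of the listed papers.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 4, k: int = 2):
        super().__init__()
        self.k = k
        # Router: scores each token against every expert.
        self.gate = nn.Linear(d_model, n_experts)
        # Experts: identical feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.ReLU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Pick the top-k experts per token.
        scores = self.gate(x)                       # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # both (tokens, k)
        weights = F.softmax(weights, dim=-1)        # normalize over selected experts
        out = torch.zeros_like(x)
        # Each expert only processes the tokens routed to it.
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel():
                out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
        return out

layer = TopKMoE(d_model=16)
tokens = torch.randn(8, 16)
print(layer(tokens).shape)  # torch.Size([8, 16])
```

The loop over experts is only for clarity; production MoE systems (the subject of the Alpa and Pathways papers above) instead dispatch tokens to experts across devices, which is where the parallelism and scheduling challenges come from.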