codecaution / Awesome-Mixture-of-Experts-Papers

A curated reading list of research in Mixture-of-Experts (MoE).
Apache License 2.0

FasterMoE paper on PPoPP'22 #7

Closed · laekov closed this 2 years ago

laekov commented 2 years ago

There is a new distributed training system, FasterMoE, presented in the paper "FasterMoE: Modeling and Optimizing Training of Large-Scale Dynamic Pre-Trained Models", published at PPoPP'22. Please kindly consider including this paper in your list.

FYI, we have also included your collection of MoE systems and papers on FastMoE's homepage.

codecaution commented 2 years ago

Thanks for the suggestion!