codecaution / Awesome-Mixture-of-Experts-Papers

A curated reading list of research in Mixture-of-Experts (MoE).

Three new papers about MoE #6

Closed XueFuzhao closed 2 years ago

XueFuzhao commented 2 years ago

Hi authors, thank you for your repo! I recently created an awesome MoE repo as well, and I will add your new work to it:
https://github.com/XueFuzhao/awesome-mixture-of-experts

Also, I think a few of my papers are missing:

- Go Wider Instead of Deeper [AAAI 2022]
- Cross-token Modeling with Conditional Computation [5 Sep 2021]
- One Student Knows All Experts Know: From Sparse to Dense [26 Jan 2022]

Thank you so much!

codecaution commented 2 years ago

Hi, XueFu. Thanks for your suggestions! I have added your papers to my list and added a contents section like the one in your repo.

Please also include our system paper: HetuMoE: An Efficient Trillion-scale Mixture-of-Expert Distributed Training System [pdf] [github]. Thanks!

XueFuzhao commented 2 years ago

No problem! Done