codecaution / Awesome-Mixture-of-Experts-Papers

A curated reading list of research on Mixture-of-Experts (MoE).

Add MoEfication & PLE #4

Closed · ZhengZixiang closed this 2 years ago

codecaution commented 2 years ago

@ZhengZixiang Thanks for your PR! I noticed that your PR removes two existing papers from the list. Please add them back and commit the change.