withinmiaov / A-Survey-on-Mixture-of-Experts


How about adding DynMoE to your survey? #1

Closed: QAQdev closed this issue 3 weeks ago

QAQdev commented 1 month ago

We propose Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models (DynMoE), a routing mechanism that lets each token activate a variable number of experts, together with a procedure for dynamically adjusting the number of experts during training (sketched below).

I think this matches your survey's focus! Please consider including the paper.

We also provide our implementation at LINs-lab/DynMoE.
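For context, here is a minimal sketch of what such threshold-based "top-any" gating could look like in PyTorch. It is an illustrative assumption, not the authors' actual implementation (see LINs-lab/DynMoE for that): the class name `TopAnyGate`, the sigmoid scoring, the per-expert threshold parameter, and the top-1 fallback for tokens that select no expert are all hypothetical choices made for this example.

```python
import torch
import torch.nn as nn


class TopAnyGate(nn.Module):
    """Illustrative top-any gate: a token activates every expert whose gate
    score clears that expert's learnable threshold, so the number of active
    experts varies per token. (Hypothetical sketch, not the DynMoE code.)"""

    def __init__(self, hidden_dim: int, num_experts: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_dim, num_experts, bias=False)
        # One trainable activation threshold per expert (an assumption here).
        self.thresholds = nn.Parameter(torch.zeros(num_experts))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, hidden_dim) -> gate weights: (num_tokens, num_experts)
        scores = torch.sigmoid(self.scorer(x))
        mask = scores > self.thresholds  # variable number of experts per token
        # Fallback assumption: a token that selects no expert routes to its
        # single highest-scoring expert instead of skipping the layer.
        empty = ~mask.any(dim=-1)
        top1 = scores.argmax(dim=-1)
        mask[empty, top1[empty]] = True
        return scores * mask.float()


gate = TopAnyGate(hidden_dim=16, num_experts=4)
weights = gate(torch.randn(8, 16))  # nonzero entries mark each token's experts
```

The second part of the proposal, adjusting the total number of experts during training (e.g. growing or pruning the expert pool based on routing statistics), is omitted from this sketch for brevity.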

withinmiaov commented 1 month ago

Thanks for providing the information. We will check it out and try to include it in our revision.

QAQdev commented 1 month ago

Thanks a lot!