codecaution / Awesome-Mixture-of-Experts-Papers

A curated reading list of research on Mixture-of-Experts (MoE).

Uni-Perceiver-MoE at NeurIPS 2022 #10

Closed Lechatelia closed 1 year ago

Lechatelia commented 1 year ago

Hey! Thank you for your work. Could you add our NeurIPS 2022 MoE work on generalist models? Uni-Perceiver-MoE: Learning Sparse Generalist Models with Conditional MoEs. [paper] [code] This work uses MoE to mitigate task interference in multi-task training and proposes conditional routing strategies to make MoE more efficient (a rough sketch of task-conditioned routing is given after this comment).

Thank you so much!
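For readers of the list, here is a minimal, hypothetical sketch of what a task-conditioned MoE layer in this spirit could look like. It is not the authors' implementation: the class name `ConditionalMoE`, the task-embedding conditioning, the expert widths, and the top-1 routing are all illustrative assumptions; the actual conditional routing strategies in Uni-Perceiver-MoE differ in detail (see the linked paper and code).

```python
# Hypothetical sketch of a task-conditioned MoE layer (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalMoE(nn.Module):
    """Top-1 MoE whose router sees each token together with a task embedding,
    so different tasks can be steered to different experts."""

    def __init__(self, d_model: int, num_experts: int, num_tasks: int, d_ff: int = 2048):
        super().__init__()
        self.task_embed = nn.Embedding(num_tasks, d_model)
        # Router scores experts from the concatenated [token, task] representation.
        self.router = nn.Linear(2 * d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor, task_id: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); task_id: (batch,) integer task indices.
        cond = self.task_embed(task_id)[:, None, :].expand_as(x)
        logits = self.router(torch.cat([x, cond], dim=-1))   # (batch, seq, num_experts)
        probs = F.softmax(logits, dim=-1)
        top_p, top_idx = probs.max(dim=-1)                    # top-1 routing decision

        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e                               # tokens routed to expert e
            if mask.any():
                out[mask] = expert(x[mask]) * top_p[mask].unsqueeze(-1)
        return out


# Usage example with made-up sizes.
layer = ConditionalMoE(d_model=512, num_experts=8, num_tasks=4)
y = layer(torch.randn(2, 16, 512), torch.tensor([0, 3]))     # y: (2, 16, 512)
```

Conditioning the router on a task signal (rather than on the token alone) is one way to reduce interference between tasks, since tokens from different tasks can be dispatched to largely disjoint sets of experts.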

codecaution commented 1 year ago

Thanks for your suggestion; I have added it.