Hey! Thank you for your work.
Could you add our MoE work on generalist models from NeurIPS 2022?
Uni-Perceiver-MoE: Learning Sparse Generalist Models with Conditional MoEs. [paper][code]
This work uses MoE to mitigate task interference in multi-task training and proposes routing strategies that make MoEs more efficient.
Thank you so much!