jiangsongtao closed this issue 2 months ago
Hi Eric, thanks a lot for your interest in our work!
Whilst we didn't focus on autoregressive models/LLMs in the pre-print, as far as model form goes, MMoEs are absolutely a viable alternative layer choice anywhere you already perform conditional computation through a regular MoE.
We had very promising results from initial experiments training 124M-parameter GPT-2 models from scratch (for next-token prediction) with CPMMoEs, replacing all of the MLPs' linear layers. The only caveat is that you probably want to use LayerNorm rather than BatchNorm1d when generating the expert coefficients, given the variable input length.
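To illustrate why LayerNorm is the safer choice there, here is a minimal NumPy sketch of a soft mixture of linear experts (all names and shapes are hypothetical, and the real CPMMoE layers use a factorised weight tensor rather than a dense per-expert stack): the expert coefficients are computed per token, with no batch statistics, so the same layer handles any sequence length during autoregressive decoding.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalise each token vector independently; no batch statistics,
    # so variable sequence lengths are fine at generation time.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

class SoftMoELinear:
    """Soft mixture of linear experts: out_t = sum_e a_e(x_t) * (x_t @ W_e).

    Hypothetical simplification of a CPMMoE linear layer: the expert
    coefficients a(x) are a softmax over a small gate applied to the
    layer-normed input.
    """
    def __init__(self, d_in, d_out, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_experts, d_in, d_out)) / np.sqrt(d_in)
        self.gate = rng.standard_normal((d_in, n_experts)) / np.sqrt(d_in)

    def __call__(self, x):                       # x: (seq_len, d_in)
        logits = layer_norm(x) @ self.gate       # (seq_len, n_experts)
        a = np.exp(logits - logits.max(-1, keepdims=True))
        a /= a.sum(-1, keepdims=True)            # softmax coefficients
        expert_out = np.einsum('td,edk->tek', x, self.W)  # (seq, E, d_out)
        return np.einsum('te,tek->tk', a, expert_out)

layer = SoftMoELinear(d_in=8, d_out=8, n_experts=4)
short, long_ = layer(np.ones((3, 8))), layer(np.ones((7, 8)))
print(short.shape, long_.shape)  # same layer, two different sequence lengths
```

With BatchNorm1d, the coefficient statistics would depend on which tokens happen to share a batch, which shifts between training and step-by-step decoding; the per-token normalisation above avoids that mismatch.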
Hope this is helpful -- do let us know how you get on :)
Thanks so much for your detailed reply!
I still have one question: is it possible to load a pretrained MLP's weights into a CPMMoE?
The MMoE layers do not support converting pre-trained MLPs' weights to MoEs, but that is a very interesting direction for future research!
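For readers who want to experiment with this direction themselves, one naive starting point (not part of the released code, and using a dense per-expert stack purely for illustration) is to replicate the pretrained dense weight across all experts, so that at initialisation any convex combination of experts reproduces the original MLP layer exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
W_dense = rng.standard_normal((8, 8))  # stand-in for a pretrained MLP weight

n_experts = 4
# Replicate the pretrained weight into every expert: since the mixing
# coefficients sum to 1, the mixture equals the dense layer at init.
W_experts = np.repeat(W_dense[None, :, :], n_experts, axis=0)

x = rng.standard_normal((5, 8))
coeffs = np.full((5, n_experts), 1.0 / n_experts)  # any coefficients summing to 1
expert_out = np.einsum('td,edk->tek', x, W_experts)
mixed = np.einsum('te,tek->tk', coeffs, expert_out)
print(np.allclose(mixed, x @ W_dense))  # mixture matches the dense layer
```

The experts then start identical and only diverge during fine-tuning; whether that beats training the MMoE from scratch is exactly the open research question.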
Thanks!
Thanks for the great work! I have a question: is this kind of MoE suitable for autoregressive models?