xrsrke / pipegoose

Large-scale 4D parallelism pre-training for 🤗 transformers with Mixture of Experts *(still a work in progress)*
MIT License
76 stars · 17 forks

[Feature] Add retrieving auxiliary and Z losses from ExpertLoss #53

Closed · xrsrke closed this issue 9 months ago
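
A minimal sketch of what the requested feature could look like: an `ExpertLoss`-style wrapper that keeps the auxiliary (load-balancing) loss and the router z-loss accessible after the forward pass. `ExpertLoss` is the only name taken from the issue title; the attribute names (`aux_loss`, `z_loss`), the constructor arguments, and the specific loss formulations (Switch-style load-balancing loss, ST-MoE-style router z-loss) are illustrative assumptions, not pipegoose's actual API.

```python
# Hypothetical sketch; names and signatures are assumptions, not pipegoose's API.
import torch
import torch.nn as nn


class ExpertLoss(nn.Module):
    """Wraps a task loss and adds MoE regularization terms.

    After each forward call, the latest auxiliary and z losses are
    retrievable via `.aux_loss` and `.z_loss` (hypothetical attributes).
    """

    def __init__(self, loss_fn: nn.Module, aux_weight: float = 0.01, z_weight: float = 0.001):
        super().__init__()
        self.loss_fn = loss_fn
        self.aux_weight = aux_weight
        self.z_weight = z_weight
        self.aux_loss = None
        self.z_loss = None

    def forward(self, logits, targets, router_logits):
        # Task loss (e.g. cross-entropy over the vocabulary).
        task_loss = self.loss_fn(logits, targets)

        # Load-balancing auxiliary loss (Switch Transformer style):
        # fraction of tokens routed to each expert times the mean router
        # probability per expert, summed and scaled by the expert count.
        probs = router_logits.softmax(dim=-1)  # [tokens, experts]
        num_experts = probs.size(-1)
        expert_fraction = torch.zeros(num_experts, device=probs.device)
        expert_fraction.scatter_add_(
            0,
            probs.argmax(dim=-1),
            torch.ones(probs.size(0), device=probs.device),
        )
        expert_fraction = expert_fraction / probs.size(0)
        mean_prob = probs.mean(dim=0)
        self.aux_loss = num_experts * (expert_fraction * mean_prob).sum()

        # Router z-loss (ST-MoE style): penalize large router logits to keep
        # the softmax numerically stable.
        self.z_loss = torch.logsumexp(router_logits, dim=-1).square().mean()

        return task_loss + self.aux_weight * self.aux_loss + self.z_weight * self.z_loss


# Usage: the individual terms stay retrievable for logging after the forward pass.
# criterion = ExpertLoss(nn.CrossEntropyLoss())
# total_loss = criterion(logits, targets, router_logits)
# print(criterion.aux_loss.item(), criterion.z_loss.item())
```

Exposing the terms as attributes (rather than returning a tuple) keeps the wrapper drop-in compatible with code that expects a single scalar loss, while still letting the training loop log the auxiliary and z losses separately.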