mistralai / mistral-inference

Official inference library for Mistral models
https://mistral.ai/
Apache License 2.0

how to finetune the mistral-moe with expert/data/pipeline parallel? #114

Open · marsggbo opened this issue 6 months ago

marsggbo commented 6 months ago

It seems that the provided code is written for a single GPU. Are there any tutorials for finetuning mistral-moe with expert/data/pipeline parallelism? A rough sketch of what I mean is below.
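
For context, this is the kind of setup I can run today: a minimal data-parallel sketch using plain PyTorch FSDP on a toy MoE block. It is not the mistral-inference API; the model, data, and hyperparameters are all placeholders. What I am asking is how to extend something like this to expert and pipeline parallelism for the real model.

```python
# Hypothetical sketch: data-parallel finetuning of a toy MoE block with PyTorch FSDP.
# Not the mistral-inference API; model, data, and hyperparameters are placeholders.
# Launch with: torchrun --nproc_per_node=<num_gpus> finetune_sketch.py
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


class ToyMoE(nn.Module):
    """Stand-in for a Mixtral-style sparse MoE feed-forward block."""

    def __init__(self, dim=512, num_experts=8, top_k=2):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):
        # x: (tokens, dim); route each token to its top-k experts.
        weights, idx = torch.topk(self.gate(x).softmax(dim=-1), self.top_k, dim=-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
        return out


def main():
    dist.init_process_group("nccl")
    local_rank = dist.get_rank() % torch.cuda.device_count()
    torch.cuda.set_device(local_rank)

    # FSDP shards parameters/gradients across ranks (data parallel only).
    model = FSDP(ToyMoE().cuda(), device_id=local_rank)
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):  # random tokens stand in for a real finetuning dataset
        x = torch.randn(1024, 512, device="cuda")
        loss = (model(x) - x).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        if dist.get_rank() == 0:
            print(f"step {step} loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

With this approach every rank still holds (a shard of) all experts, so it does not scale to the full model. Guidance on sharding experts across GPUs and combining that with pipeline stages would be appreciated.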