microsoft / DeepSpeed-MII

MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.

[REQUEST] Mixtral-8x22B support #474

Open · y-live-koba opened this issue 1 month ago

y-live-koba commented 1 month ago

Do you have plans to support Mixtral-8x22B?
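For context, a minimal sketch of what running this model through MII's existing pipeline API could look like if support is added. The model ID `mistralai/Mixtral-8x22B-v0.1` is an assumption here, and whether the non-persistent pipeline handles a model of this size is exactly what this request is asking about; multi-GPU runs would be launched with the `deepspeed` launcher as usual.

```python
# Hypothetical sketch, not a confirmed configuration: attempt to load
# Mixtral-8x22B with the standard MII non-persistent pipeline.
# The Hugging Face model ID below is an assumption.
import mii

# Build the inference pipeline for the requested model.
pipe = mii.pipeline("mistralai/Mixtral-8x22B-v0.1")

# Generate a short completion to verify the deployment works.
response = pipe(["DeepSpeed-MII is"], max_new_tokens=64)
print(response)
```

A model of this size would need to be sharded across several GPUs, e.g. by launching the script with `deepspeed --num_gpus <N> script.py`, so tensor-parallel support for the Mixtral-8x22B architecture is part of what is being requested.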