microsoft / DeepSpeed-MII

MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
Apache License 2.0

multi model deployment #525

Open whcjb opened 2 months ago

whcjb commented 2 months ago

hello, can I deploy several models at once on one server?

whcjb commented 2 months ago

I saw that https://github.com/microsoft/DeepSpeed-MII/pull/223 added support for multi-model deployment, but the latest code no longer has that functionality. So, is it possible for MII to support multiple models now?