microsoft / DeepSpeed-MII

MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.

Support for Fairseq Translation Model #74

Open huzerD opened 1 year ago

huzerD commented 1 year ago

Hi, does DeepSpeed-MII support fairseq's translation models, such as transformer.wmt16.en-de or transformer.wmt19.en-de? I ask because no translation task is listed in the Supported Models and Tasks section.
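
For reference, these are the pretrained WMT models that fairseq exposes through torch.hub. A minimal sketch of how I load them today outside of MII (the checkpoint, tokenizer, and BPE arguments follow the fairseq README example and may differ for other releases):

```python
import torch

# Load the pretrained WMT'19 En-De ensemble via torch.hub (fairseq README example).
en2de = torch.hub.load(
    "pytorch/fairseq",
    "transformer.wmt19.en-de",
    checkpoint_file="model1.pt:model2.pt:model3.pt:model4.pt",
    tokenizer="moses",
    bpe="fastbpe",
)
en2de.eval()

# Translate a sentence; fairseq handles tokenization and BPE internally.
print(en2de.translate("Machine learning is great!"))
```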

mrwyattii commented 1 year ago

@huzerD MII does not currently support these models, but we are continually adding support for new models. @jeffra is this model family on our radar for support?
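
In the meantime, deployments cover the tasks currently listed in the Supported Models and Tasks section (text-generation is used below); a translation task would need to be added before fairseq checkpoints like these could be served. A rough sketch of the existing deploy/query flow, with placeholder model and deployment names:

```python
import mii

# Deploy a model for one of the currently supported tasks (text-generation shown here).
mii.deploy(
    task="text-generation",
    model="gpt2",                       # placeholder; any supported HF model id
    deployment_name="gpt2_deployment",  # placeholder deployment name
)

# Query the persistent deployment.
generator = mii.mii_query_handle("gpt2_deployment")
result = generator.query({"query": ["DeepSpeed is"]}, max_new_tokens=64)
print(result)

# Tear down the deployment when finished.
mii.terminate("gpt2_deployment")
```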