Hello! NLLB-200 54B (MoE) is not supported by the Transformers converter. Is it supported by the Fairseq converter? The smaller NLLB checkpoints convert fine with the Transformers converter. I am asking because I would need to arrange for the extra storage and compute beforehand. Thanks!
The NLLB MoE version uses a different model architecture that is not implemented in CTranslate2, so neither converter supports it. There is currently no plan to support the MoE version.