Open pengpengtao opened 3 months ago
I hope I can defer to our ONNX specialist @xenova :smile:
Hi @pengpengtao 👋 Which model are you trying to run? In general, yes, you can infer the model separately through encoder.onnx and decoder.onnx, with outputs from the encoder being used as inputs to the decoder (usually).
Hello, the model is at https://huggingface.co/Xenova/nllb-200-distilled-600M/tree/main/onnx. I don't know how to load encoder.onnx and decoder.onnx and successfully translate a sentence into another language. Could you help me write inference code that performs translation through the encoder and decoder? Thank you.
@xenova Hello, can you help me? The model is at [https://huggingface.co/Xenova/nllb-200-distilled-600M/tree/main/onnx]. I still don't know how to load encoder.onnx and decoder.onnx to translate a sentence into another language. Thank you.
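For reference, here is a minimal, untested sketch of how the two sessions can be chained by hand in Python with onnxruntime. It assumes the input/output names produced by Optimum's seq2seq export (`input_ids`, `attention_mask`, `encoder_hidden_states`, `encoder_attention_mask`) and the usual NLLB convention of starting the decoder with `</s>` followed by the target-language token; verify the actual names against your files with `session.get_inputs()`.

```python
# Minimal sketch: run encoder.onnx once, then loop the decoder greedily.
# Assumptions (check against your export): input/output names follow
# Optimum's seq2seq convention, and NLLB decoding starts with </s>
# followed by the target-language code token.
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "Xenova/nllb-200-distilled-600M", src_lang="eng_Latn"
)
encoder = ort.InferenceSession("onnx/encoder_model.onnx")
decoder = ort.InferenceSession("onnx/decoder_model.onnx")

# Encode the source sentence once; the result is reused at every decode step.
inputs = tokenizer("Hello, how are you?", return_tensors="np")
input_ids = inputs["input_ids"].astype(np.int64)
attention_mask = inputs["attention_mask"].astype(np.int64)
encoder_hidden_states = encoder.run(
    None, {"input_ids": input_ids, "attention_mask": attention_mask}
)[0]

# Greedy decoding: feed the growing target sequence plus the encoder output
# back into the decoder until it emits </s>.
target_lang_id = tokenizer.convert_tokens_to_ids("fra_Latn")  # e.g. French
decoder_input_ids = np.array(
    [[tokenizer.eos_token_id, target_lang_id]], dtype=np.int64
)
for _ in range(128):
    logits = decoder.run(None, {
        "input_ids": decoder_input_ids,
        "encoder_hidden_states": encoder_hidden_states,
        "encoder_attention_mask": attention_mask,
    })[0]
    next_token = int(logits[0, -1].argmax())
    decoder_input_ids = np.concatenate(
        [decoder_input_ids, np.array([[next_token]], dtype=np.int64)], axis=1
    )
    if next_token == tokenizer.eos_token_id:
        break

print(tokenizer.decode(decoder_input_ids[0], skip_special_tokens=True))
```

Note that this sketch re-runs the full decoder over the whole prefix at every step; the `decoder_with_past_model.onnx` export caches key/values to avoid that, and Optimum's `ORTModelForSeq2SeqLM` with a plain `.generate()` call wires all of this up automatically if you'd rather not do it by hand.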
Have you made any progress? How did you run inference?
Feature request
Is it possible to run inference on the model separately through encoder.onnx and decoder.onnx?
Motivation
Is it possible to run inference on the model separately through encoder.onnx and decoder.onnx?
Your contribution
Is it possible to run inference on the model separately through encoder.onnx and decoder.onnx?