huggingface / optimum-neuron

Easy, fast and very cheap training and inference on AWS Trainium and Inferentia chips.
Apache License 2.0

Support Marian models inference #514

Open JingyaHuang opened 6 months ago

JingyaHuang commented 6 months ago

Feature request

As requested in aws-neuron/optimum-neuron-cache, we should first support exporting Marian models and running inference with them.

Tasks: feature-extraction / text2text-generation
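
For comparison, the existing T5 support exposes text2text-generation through `NeuronModelForSeq2SeqLM`. A minimal sketch of what the equivalent Marian export could look like, assuming the same class and compilation arguments carry over (the checkpoint, shapes, and output path below are just examples, not the implemented API):

```python
from optimum.neuron import NeuronModelForSeq2SeqLM

# Assumed flow: compile a Marian checkpoint to Neuron with static shapes,
# mirroring what is done for T5 today. Marian is not supported yet, so this
# is a sketch of the target API rather than working code.
neuron_model = NeuronModelForSeq2SeqLM.from_pretrained(
    "Helsinki-NLP/opus-mt-en-de",  # example Marian checkpoint
    export=True,
    batch_size=1,
    sequence_length=64,
    num_beams=4,
)
neuron_model.save_pretrained("opus_mt_en_de_neuron/")
```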

Motivation

Requested by users in aws-neuron/optimum-neuron-cache.

Your contribution

The support should be similar to what we have done for T5. It's not on my priority list, so it would be great if a community member wanted to pick up the task; I'm happy to assist. A rough sketch of the intended usage is below.
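
Inference would then follow the T5-style `generate` path, again assuming it transfers to Marian unchanged (the paths and generation arguments are illustrative):

```python
from transformers import AutoTokenizer
from optimum.neuron import NeuronModelForSeq2SeqLM

# Assumed usage: load the compiled artifacts produced by the export sketch
# above and run translation (text2text-generation).
tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-de")
neuron_model = NeuronModelForSeq2SeqLM.from_pretrained("opus_mt_en_de_neuron/")

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
# num_beams is assumed to match the value used at compilation time.
outputs = neuron_model.generate(**inputs, num_beams=4, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```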