huggingface / optimum

🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy to use hardware optimization tools
https://huggingface.co/docs/optimum/main/
Apache License 2.0

Cannot export TrOCR to ONNX with past #744

Closed · mht-sharma closed this issue 1 year ago

mht-sharma commented 1 year ago

System Info

- `optimum` version: 1.6.4.dev0
- `transformers` version: 4.26.0
- Platform: Linux-5.4.0-125-generic-x86_64-with-glibc2.17
- Python version: 3.8.15
- Huggingface_hub version: 0.11.0
- PyTorch version (GPU?): 1.13.1+cu117 (cuda availabe: True)
- Tensorflow version (GPU?): 2.10.1 (cuda availabe: False)

Who can help?

No response

Information

Tasks

Reproduction

python -m optimum.exporters.onnx --model="microsoft/trocr-small-handwritten" --task=vision2seq-lm-with-past model_trocr_base --for-ort

Error:

ValueError: Exporting past key values is not supported with TrOCR model!

Expected behavior

Exporting TrOCR to ONNX with past should work.

fxmarty commented 1 year ago

Hi, this is fixed in https://github.com/huggingface/optimum/pull/1456

goodevile commented 1 year ago

I got it working with:

!optimum-cli export onnx --task image-to-text --model microsoft/trocr-small-printed trocr-small-printed_onnx/

CrasCris commented 6 months ago

> I got it working with:
>
> !optimum-cli export onnx --task image-to-text --model microsoft/trocr-small-printed trocr-small-printed_onnx/

Is that the correct task for TrOCR? I can't find any docs for it. How can I load the exported model to use it for inference only?