huggingface / optimum

🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
https://huggingface.co/docs/optimum/main/
Apache License 2.0

Idefics2 Support in Optimum for ONNX export #1821

gtx-cyber opened this issue 6 months ago


Feature request

With reference to the new Idefics2 model (https://huggingface.co/HuggingFaceM4/idefics2-8b): I would like to export it to ONNX, which is currently not possible. Please enable conversion support. Current error, with transformers installed from Git via pip:
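The export command that triggers the traceback below is presumably along these lines (the output directory name is arbitrary):

    optimum-cli export onnx --model HuggingFaceM4/idefics2-8b idefics2_onnx/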

Traceback (most recent call last):
  File "/usr/local/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/usr/local/lib/python3.10/dist-packages/optimum/commands/export/onnx.py", line 265, in run
    main_export(
  File "/usr/local/lib/python3.10/dist-packages/optimum/exporters/onnx/__main__.py", line 352, in main_export
    onnx_export_from_model(
  File "/usr/local/lib/python3.10/dist-packages/optimum/exporters/onnx/convert.py", line 1048, in onnx_export_from_model
    raise ValueError(
ValueError: Trying to export a idefics2 model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type idefics2 to be supported natively in the ONNX export.

Motivation

The model performs well, and I would like to export it to ONNX as soon as possible.

Your contribution

-

gtx-cyber commented 6 months ago

@fxmarty

gtx-cyber commented 6 months ago

Please assist with this; it is essential.

matbee-eth commented 6 months ago

It would be nice. I'm assuming we first need to ensure that both SigLIP and Mistral are supported in ONNX.
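A quick way to check what the exporter already supports natively is to query TasksManager (a minimal sketch; _SUPPORTED_MODEL_TYPE is an internal attribute and may change between releases):

    # List which of the relevant model types Optimum's ONNX exporter knows about.
    # Note: _SUPPORTED_MODEL_TYPE is internal API, not a public guarantee.
    from optimum.exporters.tasks import TasksManager

    for model_type in ("mistral", "siglip", "idefics2"):
        status = "supported" if model_type in TasksManager._SUPPORTED_MODEL_TYPE else "not supported"
        print(f"{model_type}: {status}")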

gtx-cyber commented 5 months ago

@amyeroberts Please have a look. Guidance on how to proceed with the conversion of a custom model would help; I tried referring to the guide, but it wasn't very clear for a multimodal model like this.
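Following the guide, the furthest I got was the sketch below. The Idefics2OnnxConfig class, its normalized config mapping, and the input axes are all my own guesses (idefics2 has no built-in OnnxConfig), so this is a starting point rather than a working export:

    # Sketch of the custom_onnx_configs path from the export guide.
    # Idefics2OnnxConfig is hypothetical: there is no built-in OnnxConfig for
    # idefics2, and the input names/axes below almost certainly need adjusting
    # (e.g. the default dummy input generators produce 4-d pixel_values).
    from typing import Dict

    from transformers import AutoConfig
    from optimum.exporters.onnx import main_export
    from optimum.exporters.onnx.config import TextAndVisionOnnxConfig
    from optimum.utils import NormalizedTextAndVisionConfig


    class Idefics2OnnxConfig(TextAndVisionOnnxConfig):
        # Map idefics2's nested text/vision sub-configs to the names Optimum expects.
        NORMALIZED_CONFIG_CLASS = NormalizedTextAndVisionConfig.with_args(
            text_config="text_config", vision_config="vision_config"
        )

        @property
        def inputs(self) -> Dict[str, Dict[int, str]]:
            # Assumed dynamic axes; idefics2 takes a batch of images per sample.
            return {
                "input_ids": {0: "batch_size", 1: "sequence_length"},
                "attention_mask": {0: "batch_size", 1: "sequence_length"},
                "pixel_values": {0: "batch_size", 1: "num_images", 2: "num_channels", 3: "height", 4: "width"},
            }


    model_id = "HuggingFaceM4/idefics2-8b"
    config = AutoConfig.from_pretrained(model_id)
    onnx_config = Idefics2OnnxConfig(config, task="feature-extraction")

    main_export(
        model_id,
        output="idefics2_onnx",
        task="feature-extraction",  # set explicitly in case task auto-inference does not know idefics2
        custom_onnx_configs={"model": onnx_config},
        no_post_process=True,
    )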

amyeroberts commented 5 months ago

Hi @gtx-cyber - thanks for your interest in making this model ONNX exportable! As @matbee-eth mentions, the first steps would be to make sure SigLIP and Mistral-7B are exportable.
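Note that mistral is already listed among the exporter's supported architectures, so an export along these lines should already work (the checkpoint and output directory here are just examples):

    optimum-cli export onnx --model mistralai/Mistral-7B-v0.1 mistral_onnx/

The missing piece is therefore most likely the SigLIP vision tower, which would need its own OnnxConfig.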