gtx-cyber opened 6 months ago

Feature request

With reference to the new Idefics2 model (https://huggingface.co/HuggingFaceM4/idefics2-8b): I would like to export it to ONNX, which is currently not possible. Please enable conversion support. The export currently fails even with transformers installed from Git.

Motivation

The model is good, and I would like to export it to ONNX as soon as possible.

Your contribution

-

@fxmarty Please assist with this; it is essential.

It would be nice. I'm assuming we first need to ensure that both SigLIP and Mistral 8B are supported in ONNX.

@amyeroberts Please have a look. Guidance on how to proceed with the conversion of a custom model would help; I tried following the guide, but it wasn't clear for a multimodal model like this one.

Hi @gtx-cyber, thanks for your interest in making this model ONNX exportable! As @matbee-eth mentions, the first step would be to make sure SigLIP and Mistral 8B are exportable.
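For context, the standard way to attempt an ONNX export of a Hugging Face model is through the `optimum` CLI. A minimal sketch of the attempt this issue is about (for Idefics2, this command currently fails because the architecture is not yet supported by the exporter, which is exactly what the feature request asks to change):

```shell
# Install the exporter tooling (assumes a recent optimum release)
pip install "optimum[exporters]"

# Attempt the export; for Idefics2 this currently errors out with an
# unsupported-architecture message rather than producing an ONNX model
optimum-cli export onnx --model HuggingFaceM4/idefics2-8b idefics2_onnx/
```

Once SigLIP and Mistral are individually exportable, the same command would be the target workflow for the full multimodal model.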