Open qrsssh opened 1 month ago
Hi @qrsssh, thanks for the issue!
Could you please provide the versions of transformers and optimum in your environment? Has the behavior changed? What was the last working version?
You can use:
transformers-cli env
Thanks for your response.
transformers version: 4.40.2
Thanks for your reply.
Added other versions:
nltk==3.8.1
numexpr==2.8.4
numpy==1.26.4
onnx==1.16.0
onnxruntime==1.17.3
onnxruntime-gpu==1.14.1
openapi-schema-pydantic==1.2.4
opencv-python==4.9.0.80
opencv-python-headless==4.9.0.80
optimum==1.19.2
Hi @qrsssh - those models are not officially supported by Optimum (or transformers), but you should be able to get it done by specifying a custom Optimum config. Check out https://huggingface.co/docs/optimum/en/exporters/onnx/usage_guides/export_a_model#customize-the-export-of-transformers-models-with-custom-modeling for more information.
Thanks for your response. I'm trying the customize-the-export-of-transformers-models-with-custom-modeling method.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
@qubvel surya cannot be converted to ONNX.
Command line: optimum-cli export onnx --model ./vikp/surya_order/ onnx/ --task vision2seq-lm
errors:
Who can help?
The models from https://github.com/VikParuchuri/surya cannot be converted to ONNX.
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
optimum-cli export onnx --model ./vikp/surya_order/ onnx/ --task vision2seq-lm
Expected behavior
Using optimum-cli, the model cannot be converted to ONNX.
Command line: optimum-cli export onnx --model ./vikp/surya_order/ onnx/ --task vision2seq-lm
errors: