huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

https://github.com/VikParuchuri/surya cannot convert model to ONNX #31384

Open qrsssh opened 1 month ago

qrsssh commented 1 month ago

System Info

@qubvel Surya cannot convert the model to ONNX. Command line:

optimum-cli export onnx --model ./vikp/surya_order/ onnx/ --task vision2seq-lm

Errors:

Framework not specified. Using pt to export the model.
Traceback (most recent call last):
  File "/opt/conda/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/opt/conda/lib/python3.10/site-packages/optimum/commands/export/onnx.py", line 265, in run
    main_export(
  File "/opt/conda/lib/python3.10/site-packages/optimum/exporters/onnx/__main__.py", line 280, in main_export
    model = TasksManager.get_model_from_task(
  File "/opt/conda/lib/python3.10/site-packages/optimum/exporters/tasks.py", line 1938, in get_model_from_task
    model = model_class.from_pretrained(model_name_or_path, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py", line 359, in from_pretrained
    return super().from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3677, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/opt/conda/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4155, in _load_pretrained_model
    raise RuntimeError(f"Error(s) in loading state_dict for {model.__class__.__name__}:\n\t{error_msg}")
RuntimeError: Error(s) in loading state_dict for VisionEncoderDecoderModel:

Who can help?

https://github.com/VikParuchuri/surya cannot convert the model to ONNX.

Information

Tasks

Reproduction

optimum-cli export onnx --model ./vikp/surya_order/ onnx/ --task vision2seq-lm

Expected behavior

Running optimum-cli export onnx --model ./vikp/surya_order/ onnx/ --task vision2seq-lm should export the model to ONNX successfully. Instead, it fails with the RuntimeError: Error(s) in loading state_dict for VisionEncoderDecoderModel traceback shown under System Info above.

qubvel commented 1 month ago

Hi @qrsssh , thanks for the issue!

Could you please provide the versions of transformers and optimum in your environment? Has the behavior changed, and what was the last working version?

You can use:

transformers-cli env
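
For completeness, transformers-cli env only reports transformers-side details; a small stdlib-only snippet (nothing here is specific to this issue) can print both of the versions asked for:

```python
# Read installed package metadata via the standard library, so no
# package-specific version attributes are needed.
from importlib.metadata import version

for pkg in ("transformers", "optimum"):
    print(pkg, version(pkg))
```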
xenova commented 1 month ago

Hi @qrsssh - those models are not officially supported by Optimum (or transformers), but you should be able to get it done by specifying a custom Optimum config. Check out https://huggingface.co/docs/optimum/en/exporters/onnx/usage_guides/export_a_model#customize-the-export-of-transformers-models-with-custom-modeling for more information.
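
The "custom modeling" section linked above boils down to instantiating the model with the repository's own code and handing the live module to Optimum, so that transformers' from_pretrained (the step that raises the state_dict error above) is bypassed. A minimal sketch, assuming surya exposes a loader for its order model (the load_model import below is a hypothetical placeholder; check the surya source for the real entry point):

```python
# Sketch only: export an already-instantiated model instead of letting
# optimum-cli re-load it through transformers.VisionEncoderDecoderModel.
from optimum.exporters.onnx import onnx_export_from_model

# Hypothetical import path; surya's real loader lives somewhere under
# surya.model.* and may differ between versions.
from surya.model.ordering.model import load_model

model = load_model("./vikp/surya_order/")  # surya's customized torch module

# onnx_export_from_model takes the model object directly, so the
# state_dict mismatch hit by from_pretrained never comes into play.
onnx_export_from_model(
    model,
    output="onnx/",
    task="image-to-text",  # current name for the legacy vision2seq-lm task
)
```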

qrsssh commented 1 month ago
> transformers-cli env

Thanks for your response.

qrsssh commented 1 month ago

> those models are not officially supported by Optimum (or transformers), but you should be able to get it done by specifying a custom Optimum config.

Thanks for your reply.

> Could you please provide the versions of transformers and optimum in your environment? Has the behavior changed, and what was the last working version?
>
> You can use:
>
> transformers-cli env

Adding the other relevant versions:

nltk==3.8.1
numexpr==2.8.4
numpy==1.26.4
onnx==1.16.0
onnxruntime==1.17.3
onnxruntime-gpu==1.14.1
openapi-schema-pydantic==1.2.4
opencv-python==4.9.0.80
opencv-python-headless==4.9.0.80
optimum==1.19.2

qrsssh commented 1 month ago

> Hi @qrsssh - those models are not officially supported by Optimum (or transformers), but you should be able to get it done by specifying a custom Optimum config. Check out https://huggingface.co/docs/optimum/en/exporters/onnx/usage_guides/export_a_model#customize-the-export-of-transformers-models-with-custom-modeling for more information.

Thanks for your response. I'm trying the "customize the export of Transformers models with custom modeling" approach now.

github-actions[bot] commented 6 days ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.