Closed: ZTurboX closed this issue 4 months ago.
I fixed the bug and created a PR.
PR: https://github.com/huggingface/optimum/pull/1674
I tested it locally with the changes on your model and it works.
Full example notebook: https://github.com/satishsilveri/Semantic-Search/blob/main/Optimize_SBERT/BAAI_bge_base_zh.ipynb
@fxmarty
If I update from optimum==1.16.2 to optimum==1.21.2, I get the old error back:
```
from optimum.onnxruntime import ORTModelForFeatureExtraction
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/coder/.local/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1550, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/coder/.local/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1562, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import optimum.onnxruntime.modeling_ort because of the following error (look up to see its traceback):
Failed to import optimum.exporters.onnx.__main__ because of the following error (look up to see its traceback):
cannot import name 'is_torch_less_than_1_11' from 'transformers.pytorch_utils' (/home/coder/.local/lib/python3.8/site-packages/transformers/pytorch_utils.py)
```
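A quick sanity check that may help narrow this down (a suggestion, not part of the original report): print the versions Python actually resolves before the failing import, since the error means the installed optimum expects a helper (`is_torch_less_than_1_11`) that the installed transformers build no longer provides, i.e. the two packages are out of step.

```python
# Sanity check (suggestion, not from the original report): confirm which
# optimum and transformers versions are actually installed, since the error
# above means optimum is importing a helper that this transformers build
# no longer ships.
from importlib.metadata import version

print("optimum:", version("optimum"))
print("transformers:", version("transformers"))

# Retry the failing import once the two versions are known to be compatible.
from optimum.onnxruntime import ORTModelForFeatureExtraction
```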
System Info
Who can help?
@michaelbenayoun @JingyaHuang @echarlaix
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction (minimal, reproducible, runnable)
Export model and predict code:

```python
pretrained_model_path = './checkpoints/bge-base-zh'
export_model_path = './checkpoints/onnx'
```
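The reproduction above was truncated to the two path variables, so here is a minimal sketch of an export-and-predict flow using those paths; it assumes optimum's `export=True` shortcut on `ORTModelForFeatureExtraction` and may differ from the original script:

```python
# Minimal sketch (assumption: the report exports with optimum's export=True
# shortcut; the original script was not included in full).
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForFeatureExtraction

pretrained_model_path = './checkpoints/bge-base-zh'
export_model_path = './checkpoints/onnx'

# Export the PyTorch checkpoint to ONNX and save it next to the tokenizer.
tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)
model = ORTModelForFeatureExtraction.from_pretrained(pretrained_model_path, export=True)
model.save_pretrained(export_model_path)
tokenizer.save_pretrained(export_model_path)

# Predict: run the exported model on a sample sentence.
inputs = tokenizer("这是一个测试句子", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```

The exported directory can later be reloaded with `ORTModelForFeatureExtraction.from_pretrained(export_model_path)` without re-exporting.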
Expected behavior
The bge-base-zh model should export and run without raising these two errors.