huggingface / optimum

🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy to use hardware optimization tools
https://huggingface.co/docs/optimum/main/
Apache License 2.0

can not import ORTModelForCausalLM from optimum.onnxruntime #1177

Closed phamkhactu closed 1 year ago

phamkhactu commented 1 year ago

System Info

- python: 3.8.2
- optimum: 1.9.1
- onnx: 1.14

Reproduction

import torch
from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForCausalLM
from prompt import make_prompt
from optimum.onnxruntime import ORTModelForCausalLM

model = ORTModelForCausalLM.from_pretrained(
    "merge_base_lora",
    use_io_binding=True,
    export=True,
    use_cache=True,
    from_transformers=True,
)

logs

Traceback (most recent call last):
  File "convert_to_onnx.py", line 5, in <module>
    from optimum.onnxruntime import ORTModelForCausalLM
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/tupk/anaconda3/envs/nlp/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1081, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/tupk/anaconda3/envs/nlp/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1093, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import optimum.onnxruntime.modeling_decoder because of the following error (look up to see its traceback):
cannot import name 'ORTModelForCausalLM' from 'optimum.onnxruntime'

Expected behavior

Thanks for the great repo.

I've run into this problem when converting my model to ONNX, and I don't know why the exception appears. Thank you for your help.

regisss commented 1 year ago

Hi @phamkhactu, I think this happens because you don't have onnxruntime installed. Could you run

pip install --upgrade optimum[onnxruntime]

and let me know if that solves your issue please?
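
A quick way to verify that hypothesis (not part of the original exchange) is to check whether onnxruntime is importable from the very environment that runs the script:

# check_ort.py: confirm onnxruntime is importable in the active environment.
try:
    import onnxruntime
    print("onnxruntime", onnxruntime.__version__, "at", onnxruntime.__file__)
except ImportError as err:
    # If this branch runs, the optimum[onnxruntime] extras are missing here,
    # and imports from optimum.onnxruntime will fail with the error above.
    print("onnxruntime is missing:", err)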

phamkhactu commented 1 year ago

Hi @phamkhactu, I think this happens because you don't have onnxruntime installed. Could you run

pip install --upgrade optimum[onnxruntime]

and let me know if that solves your issue please?

Hi @regisss

I've followed your guide, but I still face the same error, even after creating a new env.

regisss commented 1 year ago

What are the versions of transformers and onnxruntime you are running? You can get them with

pip show transformers onnxruntime

phamkhactu commented 1 year ago

What are the versions of transformers and onnxruntime you are running? You can get them with

pip show transformers onnxruntime

Yes, here is my log:

Name: transformers
Version: 4.31.0.dev0
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /home/tupk/anaconda3/envs/nlp/lib/python3.8/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: audiolm-pytorch, optimum, peft, suno-bark, TTS
---
Name: onnxruntime
Version: 1.14.1
Summary: ONNX Runtime is a runtime accelerator for Machine Learning models
Home-page: https://onnxruntime.ai
Author: Microsoft Corporation
Author-email: onnxruntime@microsoft.com
License: MIT License
Location: /home/tupk/anaconda3/envs/nlp/lib/python3.8/site-packages
Requires: coloredlogs, flatbuffers, numpy, packaging, protobuf, sympy
Required-by: invisible-watermark

regisss commented 1 year ago

I cannot reproduce this error. Maybe you can try the latest release of Transformers rather than the dev version, and upgrade ONNX Runtime as well.

hieupth commented 1 year ago

I have the same problem:

Name: transformers
Version: 4.31.0
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: optimum

Name: onnxruntime
Version: 1.15.1
Summary: ONNX Runtime is a runtime accelerator for Machine Learning models
Home-page: https://onnxruntime.ai
Author: Microsoft Corporation
Author-email: onnxruntime@microsoft.com
License: MIT License
Location: /home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages
Requires: coloredlogs, flatbuffers, numpy, packaging, protobuf, sympy
Required-by: 

My test file is a simple .py file:

from optimum.onnxruntime import ORTOptimizer
from optimum.onnxruntime.configuration import OptimizationConfig

And here is the error log:

Traceback (most recent call last):
  File "/home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1099, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/hieupth/.conda/envs/onnx/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages/optimum/onnxruntime/optimization.py", line 21, in <module>
    import onnx
  File "/home/hieupth/Projects/stableml/simplefaq/onnx.py", line 1, in <module>
    from optimum.onnxruntime import ORTOptimizer
ImportError: cannot import name 'ORTOptimizer' from 'optimum.onnxruntime' (/home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages/optimum/onnxruntime/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/hieupth/Projects/stableml/simplefaq/onnx.py", line 1, in <module>
    from optimum.onnxruntime import ORTOptimizer
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1089, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1101, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import optimum.onnxruntime.optimization because of the following error (look up to see its traceback):
cannot import name 'ORTOptimizer' from 'optimum.onnxruntime' (/home/hieupth/.conda/envs/onnx/lib/python3.10/site-packages/optimum/onnxruntime/__init__.py)

If I run this in a notebook, the error is gone. I don't know why, but it only happens when I run the code from a .py file.

regisss commented 1 year ago

@hieupth I see the path to your script is /home/hieupth/Projects/stableml/simplefaq/onnx.py. Not sure if it will help, but could you try renaming onnx.py to a name that doesn't collide with any of the packages Optimum imports?
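
For background, Python puts the script's own directory at the front of sys.path, so a local onnx.py shadows the installed onnx package for every import in the process, including Optimum's own import onnx. A minimal diagnostic sketch (illustrative, not from the thread):

# which_onnx.py: show which file the name "onnx" actually resolves to.
import sys

import onnx

# With a local onnx.py sitting next to the script, this prints that local
# path instead of .../site-packages/onnx/__init__.py, confirming shadowing.
print("onnx resolved to:", onnx.__file__)
print("first sys.path entry:", sys.path[0])  # the script's own directory

In the traceback above, the import onnx inside optimum/onnxruntime/optimization.py resolves to /home/hieupth/Projects/stableml/simplefaq/onnx.py, which is exactly this shadowing.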

phamkhactu commented 1 year ago

@hieupth Have you solved it?

hieupth commented 1 year ago

@phamkhactu @regisss I solved it. A filename like onnx.py or convert_to_onnx.py causes the problem. Renaming it to blabla.py solved it.

phamkhactu commented 1 year ago

Hi @regisss

I have checked optimum/optimum/onnxruntime/__init__.py again. It does from typing import TYPE_CHECKING, and I've seen that TYPE_CHECKING is False at runtime, so the classes can not be imported. From line 87:

if TYPE_CHECKING:
    from .configuration import ORTConfig, QuantizationConfig
    from .modeling_decoder import ORTModelForCausalLM
    from .modeling_ort import (
        ORTModel,
        ORTModelForAudioClassification,
        ORTModelForAudioFrameClassification,
        ORTModelForAudioXVector,
        ORTModelForCTC,
        ORTModelForCustomTasks,
        ORTModelForFeatureExtraction,
        ORTModelForImageClassification,
        ORTModelForMaskedLM,
        ORTModelForMultipleChoice,
        ORTModelForQuestionAnswering,
        ORTModelForSemanticSegmentation,
        ORTModelForSequenceClassification,
        ORTModelForTokenClassification,
    )
    from .modeling_seq2seq import ORTModelForSeq2SeqLM, ORTModelForSpeechSeq2Seq
    from .optimization import ORTOptimizer
    from .quantization import ORTQuantizer
    from .trainer import ORTTrainer
    from .trainer_seq2seq import ORTSeq2SeqTrainer
    from .training_args import ORTTrainingArguments
    from .training_args_seq2seq import ORTSeq2SeqTrainingArguments
    from .utils import (
        ONNX_DECODER_MERGED_NAME,
        ONNX_DECODER_NAME,
        ONNX_DECODER_WITH_PAST_NAME,
        ONNX_ENCODER_NAME,
        ONNX_WEIGHTS_NAME,
        ORTQuantizableOperator,
    )

    try:
        if not is_diffusers_available():
            raise OptionalDependencyNotAvailable()
    except OptionalDependencyNotAvailable:
        from ..utils.dummy_diffusers_objects import (
            ORTStableDiffusionImg2ImgPipeline,
            ORTStableDiffusionInpaintPipeline,
            ORTStableDiffusionPipeline,
            ORTStableDiffusionXLImg2ImgPipeline,
            ORTStableDiffusionXLPipeline,
        )
    else:
        from .modeling_diffusion import (
            ORTStableDiffusionImg2ImgPipeline,
            ORTStableDiffusionInpaintPipeline,
            ORTStableDiffusionPipeline,
            ORTStableDiffusionXLImg2ImgPipeline,
            ORTStableDiffusionXLPipeline,
        )
else:
    import sys

    sys.modules[__name__] = _LazyModule(__name__, globals()["__file__"], _import_structure, module_spec=__spec__)

I set TYPE_CHECKING = True, but some imports still fail, because other files also use if TYPE_CHECKING: ...

I also tried export TYPE_CHECKING=True, but nothing changed. Could you check again and give me some advice? My Python version is 3.8.13. Thank you.
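
For context, the if TYPE_CHECKING: block above is only read by static type checkers; at runtime TYPE_CHECKING is always False, and the else branch replaces the module with a _LazyModule that imports each submodule on first attribute access. A simplified sketch of that lazy pattern, using a module-level __getattr__ (PEP 562) and a hypothetical import structure in place of Optimum's actual implementation:

# Sketch of a package __init__.py that lazily imports its submodules,
# a simplified stand-in for transformers' _LazyModule (not the real code).
import importlib

# Hypothetical mapping of submodules to the public names they define.
_import_structure = {
    "modeling_decoder": ["ORTModelForCausalLM"],
    "optimization": ["ORTOptimizer"],
}

# Reverse map: public name -> submodule that defines it.
_class_to_module = {
    name: module for module, names in _import_structure.items() for name in names
}

def __getattr__(name):
    # Called only when `name` is not already defined in this module.
    # The submodule is imported on first access; if that import fails
    # (missing onnxruntime, or a local file shadowing onnx), the user
    # sees "cannot import name ... from 'optimum.onnxruntime'".
    if name in _class_to_module:
        submodule = importlib.import_module("." + _class_to_module[name], __name__)
        return getattr(submodule, name)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

So the error does not come from TYPE_CHECKING being False; it comes from the lazy submodule import failing, which is why renaming the shadowing file, rather than flipping TYPE_CHECKING, is what resolves it.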

phamkhactu commented 1 year ago

@regisss I've found it: my file is named "convert_to_onnx.py", which clashes with a module imported by transformers, so it causes the error. One more time, thank you!!

gidzr commented 11 months ago

Holy donkey ass mules, Batman... I freakin' deleted and fresh-installed my server and environment 5 times, uninstalled and reinstalled a dozen times with different versions, combed through all the existing and previous documentation for onnx/runtime/exporting/optimizations... even tried Bing GPT-4 a dozen times.

and this ---> renaming my onnx.py to "ANYTHING-ELSE.py" solved the issue

@phamkhactu @regisss @hieupth THANK YOU!!!!!!!!