amd / RyzenAI-SW

MIT License

opt-onnx import issue #105

Closed ksjogo closed 1 month ago

ksjogo commented 3 months ago

Following the OPT ONNX tutorial, I run into trouble with a version mismatch between the huggingface-hub and transformers packages:

 File "C:\Users\jogo\.conda\envs\ryzenai-transformers\lib\site-packages\transformers\utils\import_utils.py", line 1272, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\jogo\.conda\envs\ryzenai-transformers\lib\site-packages\transformers\utils\import_utils.py", line 1284, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (C:\Users\jogo\.conda\envs\ryzenai-transformers\lib\site-packages\huggingface_hub\__init__.py)
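The traceback above means transformers is trying to import a helper (`split_torch_state_dict_into_shards`) that only exists in newer releases of huggingface_hub, so the installed huggingface_hub is too old for the installed transformers. A minimal sketch for checking this yourself (the package and symbol names come from the traceback; the helper function below is illustrative, not part of either library):

```python
import importlib

def has_symbol(module_name, symbol):
    """Return True if `symbol` is importable from `module_name`,
    False if the module is missing or predates the symbol."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, symbol)

# transformers expects this helper from a recent huggingface_hub;
# False here indicates the version mismatch from the traceback.
print(has_symbol("huggingface_hub", "split_torch_state_dict_into_shards"))
```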
shivani-athavale commented 3 months ago

Hi @ksjogo,

I reproduced the error on my side and found that running pip install --upgrade huggingface_hub resolves this issue.
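After running the upgrade, you can confirm that both packages are now installed at compatible versions. A small sketch using the standard library (package names assumed from the traceback; `installed_version` is an illustrative helper, not part of pip or transformers):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(package):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# After `pip install --upgrade huggingface_hub`, re-run this to verify
# that huggingface_hub moved to a release new enough for transformers.
for pkg in ("huggingface_hub", "transformers"):
    print(pkg, installed_version(pkg))
```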

madcrow99 commented 2 months ago

Thanks! This does fix the issue, but I got the warning below.

[screenshot of warning attached]

That being said, the OPT1.3B pytorch example is running well.

uday610 commented 1 month ago

Closing, as the model is running well.