KoboldAI / KoboldAI-Client

https://koboldai.com
GNU Affero General Public License v3.0

[SOLVED] split_torch_state_dict_into_shards #446

Open freeload101 opened 1 month ago

freeload101 commented 1 month ago
bash -x play.sh
+ '[' '!' -f runtime/envs/koboldai/bin/python ']'
+ bin/micromamba run -r runtime -n koboldai python aiserver.py
Traceback (most recent call last):
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1076, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/modeling_utils.py", line 78, in <module>
    from accelerate import __version__ as accelerate_version
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/accelerate/__init__.py", line 16, in <module>
    from .accelerator import Accelerator
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/accelerate/accelerator.py", line 34, in <module>
    from huggingface_hub import split_torch_state_dict_into_shards
ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/huggingface_hub/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "aiserver.py", line 58, in <module>
    from utils import debounce
  File "/opt/koboldai-client/utils.py", line 12, in <module>
    from transformers import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1066, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1078, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/huggingface_hub/__init__.py)

FIX: ./bin/micromamba run -r runtime -n koboldai pip install --upgrade huggingface_hub
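As an editorial aside (not from the original thread): the traceback shows that accelerate imports `split_torch_state_dict_into_shards` from huggingface_hub at module load, and older huggingface_hub releases simply do not export that symbol, so upgrading the package resolves the ImportError. A minimal sketch to verify whether the installed huggingface_hub is new enough (run it with the same interpreter the fix targets, e.g. via `./bin/micromamba run -r runtime -n koboldai python check_hub.py`; the filename is hypothetical):

```python
# check_hub.py -- diagnostic sketch: does the installed huggingface_hub
# export the symbol that accelerate tries to import? Returns False if
# huggingface_hub is missing or predates the function; True after a
# successful upgrade.
import importlib


def has_split_fn() -> bool:
    try:
        hub = importlib.import_module("huggingface_hub")
    except ImportError:
        return False  # huggingface_hub is not installed at all
    return hasattr(hub, "split_torch_state_dict_into_shards")


if __name__ == "__main__":
    if has_split_fn():
        print("huggingface_hub exports split_torch_state_dict_into_shards; accelerate should import cleanly")
    else:
        print("too old or missing; run: pip install --upgrade huggingface_hub")
```

If this prints the "too old or missing" line even after upgrading, the upgrade probably landed in a different environment than the one KoboldAI uses, which is the pitfall the comments below are working around.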

crspangenberg commented 1 month ago

Where should this command be run?

GeorgeEBeresford commented 6 days ago

> Where should this command be run?

I'm not sure about the command he mentioned, but I went down a similar path:

1) Open a command prompt
2) Navigate to the directory where KoboldAI is installed via `CD` (e.g. `CD C:\Program Files (x86)\KoboldAI`)
3) Run `miniconda3\condabin\activate` — this starts the command prompt with the miniconda context
4) Type `pip install --upgrade huggingface_hub`

This fixed the issue for me

Mat4Shell commented 6 days ago

Or you can try this:

  1. Launch commandline.bat or commandline.sh (depending on your OS)
  2. Execute pip install --upgrade huggingface_hub
  3. Relaunch KoboldAI

I tried this one and it worked for me, because I don't have a condabin folder in my miniconda3 folder.