ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://parisneo.github.io/lollms-webui/
Apache License 2.0

Cannot use HuggingFace models #516

Open · derritter88 opened 3 months ago

derritter88 commented 3 months ago

Expected Behavior

Download a model from Hugging Face and use it.

Current Behavior

In the browser I get the following error message: offload_weight() takes from 3 to 4 positional arguments but 5 were given. The full console output is below.

Steps to Reproduce

  1. Install the hugging_face binding
  2. Restart LoLLMS
  3. Download a model such as https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B
  4. Click on the model to enable it; the error appears (a minimal standalone reproduction sketch follows after these steps).
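
To help narrow down whether the failure comes from the lollms binding or from the underlying libraries, here is a minimal standalone reproduction sketch. It assumes the model has already been downloaded to the path shown in the console log below and that it is run inside the lollms environment (so the same transformers/accelerate versions are exercised); it simply mirrors the load call visible in the traceback.

```python
# Minimal reproduction sketch (model path assumed from the console log below).
# Run inside the lollms environment so the same library versions are used.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = r"C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO"

tokenizer = AutoTokenizer.from_pretrained(model_path)

# The binding loads with device_map="auto" (see "Using device map: auto" in the log),
# which routes weight placement through accelerate and reaches the failing offload_weight() call.
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")
print(model.config.model_type)
```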

Console log

Requested updating of setting model_name to Nous-Hermes-2-Mistral-7B-DPO
Changing model to: Nous-Hermes-2-Mistral-7B-DPO
Building model
Nous-Hermes-2-Mistral-7B-DPO
*-*-*-*-*-*-*-*
Cuda VRAM usage
*-*-*-*-*-*-*-*
{'nb_gpus': 1, 'gpu_0_total_vram': 12878610432, 'gpu_0_used_vram': 2595225600, 'gpu_0_model': 'NVIDIA GeForce RTX 4070 Ti'}
Cleared cache
*-*-*-*-*-*-*-*
Cuda VRAM usage
*-*-*-*-*-*-*-*
{'nb_gpus': 1, 'gpu_0_total_vram': 12878610432, 'gpu_0_used_vram': 2595225600, 'gpu_0_model': 'NVIDIA GeForce RTX 4070 Ti'}
Creating tokenizer C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Recovering generation config C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Creating model C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Using device map: auto
Loading checkpoint shards:  33%|█████████████████████████████████████████████████████████▋                                                                                                                   | 1/3 [00:06<00:13,  6.74s/it]
Traceback (most recent call last):
  File "C:\Users\mmuehlbacher\lollms\lollms-webui\zoos\bindings_zoo\hugging_face\__init__.py", line 266, in build_model
    self.model:AutoModelForCausalLM = AutoModelForCausalLM.from_pretrained(str(model_path),
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\models\auto\auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\modeling_utils.py", line 3502, in from_pretrained
    ) = cls._load_pretrained_model(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\modeling_utils.py", line 3926, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
                                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\modeling_utils.py", line 798, in _load_state_dict_into_meta_model
    state_dict_index = offload_weight(param, param_name, model, state_dict_folder, state_dict_index)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: offload_weight() takes from 3 to 4 positional arguments but 5 were given

Couldn't load the model C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Here is the error encountered during loading:
offload_weight() takes from 3 to 4 positional arguments but 5 were given
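
The traceback shows transformers calling offload_weight() with five positional arguments while the installed implementation accepts at most four. Since transformers imports offload_weight from accelerate, a signature mismatch like this usually points to incompatible transformers and accelerate versions in the same environment; this is an assumption about the cause, not a confirmed diagnosis. The following sketch prints the installed versions and the actual signature so they can be compared against the call site in the traceback.

```python
# Diagnostic sketch: report installed versions and the real offload_weight signature.
# Compare the signature against the 5-argument call in transformers/modeling_utils.py above.
import inspect
from importlib.metadata import version

from accelerate.utils import offload_weight

print("transformers version:", version("transformers"))
print("accelerate version:  ", version("accelerate"))
print("offload_weight signature:", inspect.signature(offload_weight))
```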
derritter88 commented 3 months ago

Additional information: I am using version 9.4.

derritter88 commented 3 months ago

The issue occurs both on Windows and within WSL.

Rainmanqxy commented 3 months ago

Exactly the same issue here. I tried different models downloaded from Hugging Face (including GGUF), and none of them worked.