If you attempt to run the server.py script from outside the text-generation-webui directory, the --model argument assumes you're referring to a remote model and attempts to download it from HF. Here's the full log:
$ python ~/pygmalion/text-generation-webui/server.py --model pyg-resharded --load-in-8bit --auto-devices --gpu-memory 6 --cai-chat
Warning: chat mode currently becomes somewhat slower with text streaming on.
Consider starting the web UI with the --no-stream option.
Loading pyg-resharded...
Traceback (most recent call last):
  File "/home/alpindale/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 264, in hf_raise_for_status
    response.raise_for_status()
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/models/pyg-resharded/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/transformers/utils/hub.py", line 410, in cached_file
    resolved_file = hf_hub_download(
  File "/home/alpindale/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/alpindale/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1105, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/home/alpindale/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/alpindale/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1440, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/home/alpindale/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 306, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-63ef82f5-12040a651ad8624d19eb5306)
Repository Not Found for url: https://huggingface.co/models/pyg-resharded/resolve/main/config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/alpindale/pygmalion/text-generation-webui/server.py", line 861, in <module>
    model, tokenizer = load_model(model_name)
  File "/home/alpindale/pygmalion/text-generation-webui/server.py", line 155, in load_model
    model = eval(command)
  File "<string>", line 1, in <module>
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 441, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 877, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/transformers/configuration_utils.py", line 573, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/transformers/configuration_utils.py", line 628, in _get_config_dict
    resolved_config_file = cached_file(
  File "/home/alpindale/.conda/envs/textgen/lib/python3.10/site-packages/transformers/utils/hub.py", line 425, in cached_file
    raise EnvironmentError(
OSError: models/pyg-resharded is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
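
For what it's worth, the root cause appears to be that the model path is built relative to the current working directory, so "models/pyg-resharded" only resolves when the script is launched from inside text-generation-webui. Below is a minimal sketch of the kind of fix I mean, anchoring the models/ lookup to the script's own location instead (the resolve_model_path helper is hypothetical, not code that exists in server.py):

```python
from pathlib import Path

# Hypothetical sketch, not the actual server.py code: resolve the models/
# folder relative to the script itself rather than the current working
# directory, so the script can be launched from anywhere.
SCRIPT_DIR = Path(__file__).resolve().parent

def resolve_model_path(model_name: str) -> str:
    """Return the local model folder next to server.py if it exists,
    otherwise fall back to treating the name as a Hugging Face repo id."""
    local_path = SCRIPT_DIR / "models" / model_name
    if local_path.is_dir():
        return str(local_path)
    return model_name
```

With something like that in place, the command above would find the local pyg-resharded folder regardless of the working directory and only fall through to the Hub when the folder genuinely doesn't exist. In the meantime, cd-ing into text-generation-webui before running server.py works around the issue.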