aida-ugent / SkillGPT

Unauthorized for huggingface.co #2

Open · dmitrydrynov opened this issue 1 year ago

dmitrydrynov commented 1 year ago

How can I fix this error?

INFO:root:Loading the model vicuna-13b ...
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 261, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/models/vicuna_13b/resolve/main/tokenizer_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/transformers/utils/hub.py", line 409, in cached_file
    resolved_file = hf_hub_download(
  File "/usr/local/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn

    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1195, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/usr/local/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1541, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/usr/local/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 293, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-64b9ccb0-729e6f24089257533a42004e;293f4046-87c9-4afd-a698-82d4729b137e)

Repository Not Found for url: https://huggingface.co/models/vicuna_13b/resolve/main/tokenizer_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/src/app/api.py", line 120, in <module>
    skill_gpt = SkillGPT(args.model_path,
  File "/usr/src/app/skillgpt.py", line 52, in __init__
    self.tokenizer, self.model, self.context_len = load_model(model_path, num_gpus)
  File "/usr/src/app/skillgpt.py", line 28, in load_model
    tokenizer = AutoTokenizer.from_pretrained(model_path)
  File "/usr/local/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 642, in from_pretrained

    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 486, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.9/site-packages/transformers/utils/hub.py", line 424, in cached_file
    raise EnvironmentError(
OSError: models/vicuna_13b is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
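
To summarize the traceback: `models/vicuna_13b` does not exist as a folder inside the container, so `AutoTokenizer.from_pretrained` falls back to treating the string as a Hugging Face Hub repo id, and since no repository named `models/vicuna_13b` exists on the Hub, the request is rejected with 401 / Repository Not Found. A minimal pre-flight check along these lines (a sketch only; the `MODEL_PATH` constant is illustrative, SkillGPT reads the real path from its configuration) makes the failure mode explicit:

```python
import os
from transformers import AutoTokenizer

# Illustrative constant; SkillGPT takes the real value from its args/.env.
MODEL_PATH = "models/vicuna_13b"

if os.path.isdir(MODEL_PATH):
    # A local folder containing tokenizer_config.json etc. is loaded directly,
    # without any request to huggingface.co.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
else:
    # Otherwise transformers treats the string as a Hub repo id; since no repo
    # called "models/vicuna_13b" exists, the Hub answers 401 / repo not found.
    raise FileNotFoundError(
        f"'{MODEL_PATH}' is not a local folder; download the model weights into it first."
    )
```
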
NanLi2021 commented 1 year ago

Hi, you can download the model weights of the latest vicuna-13b here, and save them to your local folder models/vicuna_13b as defined in the .env file (or change it to whatever folder you like). Thanks!
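
For reference, a sketch of one way to fetch a merged Vicuna checkpoint into that folder with huggingface_hub. The repo id below is an assumption, since the download link from the comment is not preserved in this thread, and early Vicuna releases were delta weights that first had to be merged with the base LLaMA model:

```python
# Sketch: fetch a merged Vicuna-13B checkpoint into the folder SkillGPT expects.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="lmsys/vicuna-13b-v1.5",   # assumed repo id; use the weights the maintainers link to
    local_dir="models/vicuna_13b",     # the folder referenced by the .env file
)
```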

dmitrydrynov commented 1 year ago

> Hi, you can download the model weights of the latest vicuna-13b here, and save them to your local folder models/vicuna_13b as defined in the .env file (or change it to whatever folder you like). Thanks!

Thank you. Can I use other models? For example, lmsys/fastchat-t5-3b-v1.0? I need a solution suitable for commercial use.
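
For reference, lmsys/fastchat-t5-3b-v1.0 is an encoder-decoder (Flan-T5-based) model rather than a decoder-only one, so even with the weights downloaded locally, SkillGPT's loading code may need adjustments. A hedged sketch of how such a model is typically loaded (the local folder name is hypothetical):

```python
# Sketch only, with a hypothetical local folder name. fastchat-t5 is an
# encoder-decoder (Flan-T5) model, so it loads with AutoModelForSeq2SeqLM rather
# than a causal-LM class; whether SkillGPT's own load_model/embedding code works
# with it unchanged is not verified here.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_path = "models/fastchat_t5_3b"  # hypothetical folder for the downloaded weights
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSeq2SeqLM.from_pretrained(model_path)
```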

NanLi2021 commented 1 year ago

Hi, as our LICENSE clearly states, our software is NOT licensed for commercial use.

Of course, we would be happy to discuss possible commercial use, and an alternative license to allow for that. But to do that we'd need to involve the Technology Transfer office of Ghent University. Please let us know if you wish to do that, and we will be happy to get that process started.

Thanks for your interest.