anarchy-ai / LLM-VM

irresponsible innovation. Try now at https://chat.dev/
https://anarchy.ai/
MIT License

README Llama2 Example Errors Requesting HuggingFace API Key #423

Open collinarnett opened 6 months ago

collinarnett commented 6 months ago

Describe the bug
When initializing the client with llama2 as the big_model argument, as described in the "Running LLM's Locally" section of the README, HuggingFace throws:

Repo model meta-llama/Llama-2-7b-hf is gated. You must be authenticated to access it.

To Reproduce
Steps to reproduce the behavior:

  1. Install LLM-VM
  2. Run:

    # import our client
    from llm_vm.client import Client

    # Select the LlaMA 2 model
    client = Client(big_model = 'llama2')


Expected behavior
Model is downloaded and the client is initialized.

Screenshots

client = Client(big_model = 'llama2')
Traceback (most recent call last):
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 269, in hf_raise_for_status
    response.raise_for_status()
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-2-7b-hf/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/transformers/utils/hub.py", line 430, in cached_file
    resolved_file = hf_hub_download(
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1360, in hf_hub_download
    raise head_call_error
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1233, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1622, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 285, in hf_raise_for_status
    raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-65846a42-5d524e591835b68578f43fec;8085e1d7-664a-4f33-be03-53c8a5fc43d5)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-7b-hf/resolve/main/config.json. Repo model meta-llama/Llama-2-7b-hf is gated. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/llm_vm/client.py", line 70, in __init__
    self.teacher = load_model_closure(big_model)(big_model_config)
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/llm_vm/onsite_llm.py", line 99, in __init__
    self.model=self.model_loader(model_kw_args)
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/llm_vm/onsite_llm.py", line 607, in model_loader
    return LlamaForCausalLM.from_pretrained(self.model_uri)
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2600, in from_pretrained
    resolved_config_file = cached_file(
  File "/nix/store/ryhyfwvxiws6jd444kpm3zxr31yrz75c-python3-3.10.13-env/lib/python3.10/site-packages/transformers/utils/hub.py", line 445, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo. Make sure to request access at https://huggingface.co/meta-llama/Llama-2-7b-hf and pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>.



Desktop (please complete the following information):
 - system: `"x86_64-linux"`
 - host os: `Linux 6.1.65, NixOS, 24.05 (Uakari), 24.05.20231204.2c7f3c0`
 - multi-user?: `yes`
 - sandbox: `yes`
 - version: `nix-env (Nix) 2.19.2`
 - channels(collin): `""`
 - channels(root): `"nixos-21.11.335130.386234e2a61"`
 - nixpkgs: `/nix/store/aiv01710wqn2b7hms2253d1cq89kdzh8-source`


Suggested Fix
Add a HuggingFace token argument similar to the OpenAI API key.
wansatya commented 5 months ago

Maybe this can help?

https://github.com/anarchy-ai/LLM-VM/issues/429#issuecomment-1876669796
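Until a token argument lands, a possible workaround (a sketch, assuming `huggingface_hub` reads the HF_TOKEN environment variable during downloads; the token value below is a placeholder, not a real credential):

```python
import os

# Export a HuggingFace access token before the client is created, so
# huggingface_hub can authenticate when it fetches the gated
# meta-llama/Llama-2-7b-hf weights. Replace the placeholder with a
# real token from your HuggingFace account settings; you must also
# have been granted access to the repo.
os.environ["HF_TOKEN"] = "hf_xxxxxxxxxxxx"

# from llm_vm.client import Client
# client = Client(big_model = 'llama2')
```

Running `huggingface-cli login` once (as the OSError above suggests) should have the same effect by caching the token on disk.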