gary149 opened 3 months ago
hmm, let's not include the token, but mention running `huggingface-cli login` instead
cc @osanseviero
I agree with a mention of `huggingface-cli login`
and btw, the error message when not logged in should already prompt you to run `huggingface-cli login`
let me know; I can create an issue on transformers
btw it should work with diffusers too
I meant it must already be the case, AFAIK
and yes it's not library specific
on transformers:

```
/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
401 Client Error. (Request ID: Root=1-6672b9e2-4e7561551ca6a7d17bd1e5a8;718527cf-0693-4b8f-ad60-8a60fb2cf76c)
Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
```
on diffusers:

```
HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/models/stabilityai/stable-diffusion-3-medium

The above exception was the direct cause of the following exception:

GatedRepoError                            Traceback (most recent call last)
GatedRepoError: 401 Client Error. (Request ID: Root=1-6672ba64-6d0027fd0aff7b1b00e1e485;ea7fdd17-c3a6-4c4f-b0a2-e6d37f3b5a6f)
Cannot access gated repo for url https://huggingface.co/api/models/stabilityai/stable-diffusion-3-medium.
Access to model stabilityai/stable-diffusion-3-medium is restricted. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

OSError                                   Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/diffusers/pipelines/pipeline_utils.py in download(cls, pretrained_model_name, **kwargs)
   1546                 else:
   1547                     # 2. we forced `local_files_only=True` when `model_info` failed
-> 1548                     raise EnvironmentError(
   1549                         f"Cannot load model {pretrained_model_name}: model is not cached locally and an error occurred"
   1550                         " while trying to fetch metadata from the Hub. Please check out the root cause in the stacktrace"

OSError: Cannot load model stabilityai/stable-diffusion-3-medium: model is not cached locally and an error occurred while trying to fetch metadata from the Hub. Please check out the root cause in the stacktrace above.
```
OK, so maybe we then add "Please log in using `huggingface-cli login` or a similar mechanism" to that error message, i.e. to "Access to model stabilityai/stable-diffusion-3-medium is restricted. You must be authenticated to access it."?
(I think that error string lives in moon)
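As a minimal sketch of the proposal: appending the hint to the authentication error string could look like the Python below. This is illustrative only; the real string lives in moon-landing (TypeScript), and the helper name is hypothetical.

```python
# Illustrative sketch only -- not the actual moon-landing code.
# Shows the proposed shape of the error message with a login hint appended.
LOGIN_HINT = "Please log in using `huggingface-cli login` or a similar mechanism."

def with_login_hint(message: str) -> str:
    """Append the CLI login hint to a gated-repo authentication error."""
    return f"{message} {LOGIN_HINT}"

print(with_login_hint(
    "Access to model stabilityai/stable-diffusion-3-medium is restricted. "
    "You must be authenticated to access it."
))
```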
but orthogonally, I'm OK with adding a `huggingface-cli login` command to the snippets, as discussed in https://github.com/huggingface/huggingface.js/issues/765#issuecomment-2178338707
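For the snippets themselves, a hypothetical generator change could prepend the login command when the model is gated. This is a sketch only; the function name and signature are illustrative, not the actual huggingface.js API (and the real generators are TypeScript).

```python
# Hypothetical snippet generator: prepend `huggingface-cli login` for gated
# models. Names are illustrative, not the huggingface.js API.
def shell_snippet(model_id: str, gated: bool) -> str:
    """Build the shell snippet shown on a model page (sketch)."""
    lines = ["pip install huggingface_hub"]
    if gated:
        # Gated models need authentication before any download works.
        lines.append("huggingface-cli login")
    lines.append(f"huggingface-cli download {model_id}")
    return "\n".join(lines)
```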
oh, and btw, maybe let's also start adding `brew install huggingface-cli` on the line before, as a good easy way to install it?
(cc @Wauplin too)
EDIT: what's an easy alternative for Windows?
`pip install huggingface_hub` or `pip install --upgrade huggingface_hub` would be the only way for Windows atm, AFAIK.
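Put together, the per-platform install hint discussed above could be chosen roughly like this. A sketch under the assumptions in this thread: brew on macOS, pip everywhere else until a winget package exists.

```python
# Sketch: pick an install command for the CLI per platform.
# Assumptions: brew formula on macOS; winget is only proposed, so Windows
# (and Linux) fall back to pip for now.
import platform

def install_hint() -> str:
    """Return the suggested install command for the current platform."""
    if platform.system() == "Darwin":
        return "brew install huggingface-cli"
    # Linux and Windows: pip is the reliable path at the moment.
    return "pip install --upgrade huggingface_hub"
```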
Ah yeah, here is the authentication string prompt: https://github.com/huggingface/moon-landing/blob/19a45f022a771c9f1b045c95e072122982513514/server/lib/Auth.ts#L930
For Windows, I can help with having a winget install path: https://github.com/microsoft/winget-cli
Would it be `winget install huggingface-cli`?
Tentatively yes: this or `HuggingFace.Cli`. I think there are short and long references.
@mfuntowicz I'm not knowledgeable about winget specifically, but please let me know if I can be of any assistance packaging `huggingface-cli` :)
For gated models: add a comment on how to create the token, and update the code snippet to include the token (edit: as a placeholder).
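A sketch of what that updated snippet could look like, with the token shown strictly as a placeholder. The function name is illustrative and `hf_...` is intentionally not a real token.

```python
# Sketch of a gated-model code snippet with the token as a placeholder.
# "hf_..." is a placeholder, never a real value.
def gated_model_snippet(model_id: str) -> str:
    """Build the Python snippet shown for a gated model (illustrative)."""
    return (
        "# Create a read token at https://huggingface.co/settings/tokens\n"
        "from transformers import AutoModelForCausalLM\n"
        f'model = AutoModelForCausalLM.from_pretrained("{model_id}", token="hf_...")\n'
    )

print(gated_model_snippet("meta-llama/Meta-Llama-3-8B"))
```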