oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

504 Server Error Gateway Time-out for url when downloading a model #1939

Closed leszekhanusz closed 1 year ago

leszekhanusz commented 1 year ago

Describe the bug

I tried to download the TheBloke/WizardLM-7B-uncensored-GPTQ model but I receive a 504 error code.

I'm not sure if I'm doing something wrong or if it's down for everybody. The HuggingFace status page seems to indicate that everything is running smoothly for now.

Is there an existing issue for this?

Reproduction

$ python download-model.py TheBloke/WizardLM-7B-uncensored-GPTQ
Traceback (most recent call last):
  File "/mnt/tera/git-repos/text-generation-webui/download-model.py", line 267, in <module>
    links, sha256, is_lora = get_download_links_from_huggingface(model, branch, text_only=args.text_only)
  File "/mnt/tera/git-repos/text-generation-webui/download-model.py", line 102, in get_download_links_from_huggingface
    r.raise_for_status()
  File "/mnt/tera/miniconda3/envs/textgen/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 504 Server Error: Gateway Time-out for url: https://huggingface.co/api/models/TheBloke/WizardLM-7B-uncensored-GPTQ/tree/main

Screenshot

No response

Logs

N/A

System Info

N/A
Cobrabb commented 1 year ago

Same issue here. It also happens if I select one of the defaults from the list that's presented instead of manually specifying a Hugging Face model.

Cobrabb commented 1 year ago

It seems like the Hugging Face API itself is having trouble: I tried to open "https://huggingface.co/api/models/facebook/opt-6.7b/tree/main" in the browser and it similarly timed out with a 504 error. Probably not an issue with this tool.
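For anyone who wants to check programmatically whether the endpoint has recovered, here is a minimal sketch (not part of download-model.py; the attempt count and delays are arbitrary assumptions) that polls the same tree URL with requests and backs off on 5xx responses:

import time
import requests

url = "https://huggingface.co/api/models/facebook/opt-6.7b/tree/main"

for attempt in range(5):
    r = requests.get(url, timeout=30)
    if r.status_code < 500:           # anything other than a server/gateway error
        r.raise_for_status()          # still raise on 4xx
        print(f"OK, {len(r.json())} entries in the repo tree")
        break
    wait = 2 ** attempt
    print(f"got {r.status_code}, retrying in {wait}s")
    time.sleep(wait)
else:
    r.raise_for_status()              # surface the last 5xx if all attempts failed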

Cobrabb commented 1 year ago

Does anyone have a good guide for how to manually install models? I can see where the tool tells me to put them, but I'm not sure which files I should put there or in what structure. With Stable Diffusion it's a single model file, but these models seem to be composed of multiple files. Do I put the whole folder?

tandpastatester commented 1 year ago

I've been having this issue since this morning. Either the API is unreachable or some address has changed. The HF status page says it's up, but the models seem to be inaccessible through their API URL.

@Cobrabb You only need the json and txt files plus the model weights (the large .bin or .safetensors file, usually the safetensors); put them inside the /models folder in the main dir of ooba. Or you can try git clone https://huggingface.co/facebook/opt-6.7b inside your /models folder (clone the repo page URL, not the /api/models/ one).
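Another option is a minimal sketch using huggingface_hub's snapshot_download (assuming huggingface_hub is installed and new enough to support local_dir; the target folder name just follows the webui's usual models/Org_Model naming convention) to pull every file of the repo into the models folder:

from huggingface_hub import snapshot_download

repo_id = "TheBloke/WizardLM-7B-uncensored-GPTQ"

# Download all repo files (json, txt, safetensors, ...) into the webui's models dir.
snapshot_download(
    repo_id=repo_id,
    local_dir=f"models/{repo_id.replace('/', '_')}",
)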

Baael commented 1 year ago

Both HF and git clone are giving me 504, and GitHub seems to be having a lot of problems today too. I am able to download by opening the file page and clicking the download link on "This file is stored with Git LFS. It is too big to display, but you can still [download] it."
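That per-file download link resolves to the repo's resolve/main endpoint, so the same thing can be scripted with a streaming request. A minimal sketch, where the filename shown is a hypothetical placeholder and should be replaced with the actual weight file listed on the model's "Files" page:

import os
import requests

# Assumptions: repo and filename are illustrative; substitute the real
# .safetensors/.bin name from the model repo's file list.
repo = "TheBloke/WizardLM-7B-uncensored-GPTQ"
filename = "model.safetensors"  # hypothetical name
url = f"https://huggingface.co/{repo}/resolve/main/{filename}"

dest_dir = f"models/{repo.replace('/', '_')}"
os.makedirs(dest_dir, exist_ok=True)

# Stream the large LFS file to disk in 1 MiB chunks.
with requests.get(url, stream=True, timeout=60) as r:
    r.raise_for_status()
    with open(os.path.join(dest_dir, filename), "wb") as f:
        for chunk in r.iter_content(chunk_size=1 << 20):
            f.write(chunk)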

leszekhanusz commented 1 year ago

It was fixed.