AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: Exception: Error while deserializing header: HeaderTooLarge #8123

Open nikkwong opened 1 year ago

nikkwong commented 1 year ago

Is there an existing issue for this?

What happened?

When using controlnet with any models, inference fails with the error:

Error running process: /Users/nikk/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py
Traceback (most recent call last):
  File "/Users/nikk/stable-diffusion-webui/modules/scripts.py", line 386, in process
    script.process(p, *script_args)
  File "/Users/nikk/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py", line 571, in process
    else self.build_control_model(p, unet, model, lowvram)
  File "/Users/nikk/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py", line 433, in build_control_model
    state_dict = load_state_dict(model_path)
  File "/Users/nikk/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/utils.py", line 9, in load_state_dict
    state_dict = safetensors.torch.load_file(ckpt_path, device=location)
  File "/Users/nikk/stable-diffusion-webui/venv/lib/python3.10/site-packages/safetensors/torch.py", line 98, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
Exception: Error while deserializing header: HeaderTooLarge

Steps to reproduce the problem

  1. Download controlnet
  2. Use any preprocessor/model
  3. Attempt inference; it fails.

What should have happened?

Inference should work correctly.

Commit where the problem happens

5a1b62e9f8048e20a9ff47df73b16f8a0b5e673c

What platforms do you use to access the UI ?

Windows, MacOS

What browsers do you use to access the UI ?

Mozilla Firefox, Google Chrome

Command Line Arguments

N/a

List of extensions

Controlnet

Console logs

No logs generated except svelte warnings.

Additional information

Tried with a clean install on both macOS and Windows.

Nakigatsuran commented 1 year ago

Same issue here.

derekhsu commented 1 year ago

You've used a wrong model file. Download it again and check your model file.

Nakigatsuran commented 1 year ago

Thanks. It works after changing the filename extension from .safetensors to .ckpt.

nikkwong commented 1 year ago

Which models should be *.ckpt? The files in /extensions/sd-webui-controlnet/models/*? I've blindly changed some of them from .safetensors to .ckpt and I get:

Loading model: control_openpose-fp16 [e3b0c442]
Error verifying pickled file from /Users/nikk/stable-diffusion-webui/extensions/sd-webui-controlnet/models/control_openpose-fp16.ckpt:
Traceback (most recent call last):
  File "/Users/nikk/stable-diffusion-webui/modules/safe.py", line 81, in check_pt
    with zipfile.ZipFile(filename) as z:
  File "/opt/homebrew/Cellar/python@3.10/3.10.9/Frameworks/Python.framework/Versions/3.10/lib/python3.10/zipfile.py", line 1267, in __init__
    self._RealGetContents()
  File "/opt/homebrew/Cellar/python@3.10/3.10.9/Frameworks/Python.framework/Versions/3.10/lib/python3.10/zipfile.py", line 1334, in _RealGetContents
    raise BadZipFile("File is not a zip file")
zipfile.BadZipFile: File is not a zip file

When I change the models used for inference i.e. models/Stable-Diffusion/ I get:

loading stable diffusion model: AttributeError
Traceback (most recent call last):
  File "/Users/nikk/stable-diffusion-webui/webui.py", line 111, in initialize
    modules.sd_models.load_model()
  File "/Users/nikk/stable-diffusion-webui/modules/sd_models.py", line 374, in load_model
    state_dict = get_checkpoint_state_dict(checkpoint_info, timer)
  File "/Users/nikk/stable-diffusion-webui/modules/sd_models.py", line 232, in get_checkpoint_state_dict
    res = read_state_dict(checkpoint_info.filename)
  File "/Users/nikk/stable-diffusion-webui/modules/sd_models.py", line 218, in read_state_dict
    sd = get_state_dict_from_checkpoint(pl_sd)
  File "/Users/nikk/stable-diffusion-webui/modules/sd_models.py", line 191, in get_state_dict_from_checkpoint
    pl_sd = pl_sd.pop("state_dict", pl_sd)
AttributeError: 'NoneType' object has no attribute 'pop'

upon running ./webui.sh. Thanks for the help.

derekhsu commented 1 year ago

You just need to download the original model again. Remember to download the "RAW" model file, not the HTML page. Just open your downloaded model or check its size, and you will see what I mean. I had the same "HeaderTooLarge" error because I had mistakenly downloaded the HTML description page of the model instead of the model itself.
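The check described above can be sketched in Python. This is a hypothetical diagnostic helper, not part of the webui or the safetensors library; it only relies on the documented safetensors layout (an 8-byte little-endian header length followed by a JSON header) and the well-known first line of a Git LFS pointer file:

```python
import struct

def diagnose_model_file(path):
    """Best-effort guess at why safetensors raises HeaderTooLarge."""
    with open(path, "rb") as f:
        head = f.read(512)
    # A Git LFS pointer file always starts with this version line.
    if head.startswith(b"version https://git-lfs.github.com/spec/v1"):
        return "git-lfs pointer, not the real weights"
    # An HTML page saved in place of the raw model is another common cause.
    if head.lstrip().lower().startswith((b"<!doctype", b"<html")):
        return "html page saved instead of the raw model"
    if len(head) < 8:
        return "file truncated"
    # A valid .safetensors file starts with an 8-byte little-endian
    # length of the JSON header that follows it.
    (header_len,) = struct.unpack("<Q", head[:8])
    if header_len > 100_000_000:
        return "not a safetensors file (header length is absurd)"
    return "header length looks plausible"
```

If the first bytes of your "model" decode to HTML or an LFS pointer line, the 8-byte length field reads as a huge number, which is exactly what triggers HeaderTooLarge.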

nikkwong commented 1 year ago

Sorry, you're talking about the models for inference, i.e. /models/Stable-Diffusion/**?

ANK789 commented 1 year ago

It turned out that the file I downloaded was wrong. Thanks!

cduk commented 1 year ago

I had the same problem. I used git to download the files, and the repo used LFS, which meant the downloaded files contained only pointers to the real files. A quick look at the file revealed the problem.
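For reference, an LFS pointer file is a tiny text file; opening it in any editor shows something like the following (the hash and size are illustrative placeholders):

```
version https://git-lfs.github.com/spec/v1
oid sha256:<64-hex-digit hash>
size <size of the real file in bytes>
```

If your supposedly multi-hundred-megabyte model is only ~130 bytes and looks like this, you only cloned the pointer.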

derekhsu commented 1 year ago

I think this issue is done and should be closed?

spinferno commented 1 year ago

I encountered this exception too. A clue was that it only happened with a single specific checkpoint. If that's your case, simply redownload it; that is what I did to fix it. It must have been a corrupted copy.

ShashwatNigam99 commented 1 year ago

This error usually comes up when you use git clone to download a repository with LFS files, and only a pointer to them is downloaded. Download the actual LFS files using:

git lfs pull --include <files to download> --exclude <files to exclude>

Check the size of the files after download just to be sure. Should work fine after that. I believe this issue can be closed now!
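The size check suggested above can also be done programmatically. A minimal sketch (a hypothetical helper, not part of the webui; pass it your own models directory), relying on the fact that a Git LFS pointer is only ~130 bytes while real ControlNet weights are hundreds of megabytes:

```python
import os

def report_model_sizes(models_dir):
    """Map each file in models_dir to its size in bytes.

    Suspiciously tiny entries (~130 bytes) are likely Git LFS pointers
    rather than the actual model weights.
    """
    sizes = {}
    for name in sorted(os.listdir(models_dir)):
        path = os.path.join(models_dir, name)
        if os.path.isfile(path):
            sizes[name] = os.path.getsize(path)
    return sizes
```

For example, `report_model_sizes("extensions/sd-webui-controlnet/models")` would make any pointer-sized file stand out immediately.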

dsdanielpark commented 1 year ago

Please use the pointer file URL as follows.

danielbell99 commented 4 months ago

Install git-lfs

conda install -c conda-forge git-lfs

Configure Git LFS in conda venv

conda activate VENV
git lfs install

Clone the base model architecture

git clone https://huggingface.co/sentence-transformers/msmarco-distilroberta-base-v2

danielajisafe commented 2 months ago

In my case, cloning with LFS using the 1st and 3rd commands (to avoid downloading large files) created the issue. I deleted the folder and used the 2nd command directly, then confirmed that the files are the actual files I need, not pointers, using ls -lha folder_path.
