camenduru / stable-diffusion-webui-colab

stable diffusion webui colab
The Unlicense

[Bug]: Why cant I change models? #366

Closed huajitellyou closed 1 year ago

huajitellyou commented 1 year ago

What happened?

When I click to switch models, the progress bar completes but the original model stays loaded, and then an error is reported:

File "/usr/local/lib/python3.9/dist-packages/safetensors/torch.py", line 98, in load_file with safe_open(filename, framework="pt", device=device) as f: Exception: Error while deserializing header: HeaderTooLarge

Colab cell output

Loading weights [18f76d0ac4] from /content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/models/Stable-diffusion/anything-v4.5-vae-swapped.safetensors
changing setting sd_model_checkpoint to anything-v4.5-vae-swapped.safetensors [18f76d0ac4]: Exception
Traceback (most recent call last):
  File "/content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/modules/shared.py", line 568, in set
    self.data_labels[key].onchange()
  File "/content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/modules/call_queue.py", line 15, in f
    res = func(*args, **kwargs)
  File "/content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/webui.py", line 146, in <lambda>
    shared.opts.onchange("sd_model_checkpoint", wrap_queued_call(lambda: modules.sd_models.reload_model_weights()))
  File "/content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/modules/sd_models.py", line 488, in reload_model_weights
    state_dict = get_checkpoint_state_dict(checkpoint_info, timer)
  File "/content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/modules/sd_models.py", line 262, in get_checkpoint_state_dict
    res = read_state_dict(checkpoint_info.filename)
  File "/content/drive/MyDrive/stable-diffusion-webui-colab/stable-diffusion-webui/modules/sd_models.py", line 241, in read_state_dict
    pl_sd = safetensors.torch.load_file(checkpoint_file, device=device)
  File "/usr/local/lib/python3.9/dist-packages/safetensors/torch.py", line 98, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
Exception: Error while deserializing header: HeaderTooLarge
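For context, a safetensors file starts with 8 bytes encoding the length of a JSON header; `HeaderTooLarge` usually means those first bytes are not a valid length, which typically happens when the download was truncated or the file is actually something else (an HTML error page, or a pickle `.ckpt` renamed to `.safetensors`). A minimal sketch of a sanity check (the function name and the 100 MB threshold are illustrative assumptions, not part of the safetensors library):

```python
import json
import struct

def looks_like_valid_safetensors(path):
    """Rough sanity check: does the file begin with a plausible
    safetensors header (8-byte little-endian length + JSON)?"""
    with open(path, "rb") as f:
        prefix = f.read(8)
        if len(prefix) < 8:
            return False  # file too short to even hold a header length
        (header_len,) = struct.unpack("<Q", prefix)
        # An HTML page or pickle file decodes to an absurd length here;
        # 100 MB is an arbitrary but generous upper bound for real headers.
        if header_len > 100 * 1024 * 1024:
            return False
        try:
            json.loads(f.read(header_len))
            return True
        except (json.JSONDecodeError, UnicodeDecodeError):
            return False
```

If this returns False for the file on Google Drive, re-downloading the model (and comparing the file size against the one listed on Hugging Face) is the usual fix.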

Which colab and model(s) were you using when the error occurred?

https://huggingface.co/ckpt/anything-v4.5-vae-swapped/resolve/main/anything-v4.5-vae-swapped.safetensors

Which Public WebUI Colab URL were you using when the error occurred?

remote.moe

If you used HiRes mode when the error occurred, please provide the Hires info

No response

Anonimouche commented 1 year ago

Did you try using the ckpt version instead of the safetensors one?