comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

Error while deserializing header: HeaderTooLarge #5826

Open · Aipathon opened 2 days ago

Aipathon commented 2 days ago

Expected Behavior

ComfyUI generates the image.

Actual Behavior

Seeing the described error.

Steps to Reproduce

Install Flux with the other necessary modules, as instructed in #5824.

Run the workflow

See the error.

Debug Logs

## Error Details
- **Node ID:** 11
- **Node Type:** DualCLIPLoader
- **Exception Type:** safetensors_rust.SafetensorError
- **Exception Message:** Error while deserializing header: HeaderTooLarge
## Stack Trace

  File "/home/Aipathon/git/ComfyUI/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/Aipathon/git/ComfyUI/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/Aipathon/git/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "/home/Aipathon/git/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/Aipathon/git/ComfyUI/nodes.py", line 952, in load_clip
    clip = comfy.sd.load_clip(ckpt_paths=[clip_path1, clip_path2], embedding_directory=folder_paths.get_folder_paths("embeddings"), clip_type=clip_type)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/Aipathon/git/ComfyUI/comfy/sd.py", line 478, in load_clip
    clip_data.append(comfy.utils.load_torch_file(p, safe_load=True))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/Aipathon/git/ComfyUI/comfy/utils.py", line 34, in load_torch_file
    sd = safetensors.torch.load_file(ckpt, device=device.type)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/Aipathon/.local/lib/python3.12/site-packages/safetensors/torch.py", line 313, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

System Information

Logs

2024-11-28T17:15:31.930801 - Total VRAM 12272 MB, total RAM 32011 MB
2024-11-28T17:15:31.930857 - pytorch version: 2.5.1+rocm6.2
2024-11-28T17:15:31.930990 - Set vram state to: NORMAL_VRAM
2024-11-28T17:15:31.931065 - Device: cuda:0 AMD Radeon RX 6700 XT : native
2024-11-28T17:15:32.845822 - Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
2024-11-28T17:15:33.664698 - [Prompt Server] web root: /home/Aipathon/git/ComfyUI/web
2024-11-28T17:15:34.188790 - 
Import times for custom nodes:
2024-11-28T17:15:34.188846 -    0.0 seconds: /home/Aipathon/git/ComfyUI/custom_nodes/websocket_image_save.py
2024-11-28T17:15:34.188870 - 
2024-11-28T17:15:34.192146 - Starting server

2024-11-28T17:15:34.192401 - To see the GUI go to: http://127.0.0.1:8188
2024-11-28T17:15:56.614670 - got prompt
2024-11-28T17:15:56.810073 - Using split attention in VAE
2024-11-28T17:15:56.811238 - Using split attention in VAE
2024-11-28T17:15:58.159079 - model weight dtype torch.bfloat16, manual cast: None
2024-11-28T17:15:58.166335 - model_type FLUX
2024-11-28T17:16:13.774713 - !!! Exception during processing !!! Error while deserializing header: HeaderTooLarge
2024-11-28T17:16:13.778969 - Traceback (most recent call last):
  File "/home/Aipathon/git/ComfyUI/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Aipathon/git/ComfyUI/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Aipathon/git/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/home/Aipathon/git/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Aipathon/git/ComfyUI/nodes.py", line 952, in load_clip
    clip = comfy.sd.load_clip(ckpt_paths=[clip_path1, clip_path2], embedding_directory=folder_paths.get_folder_paths("embeddings"), clip_type=clip_type)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Aipathon/git/ComfyUI/comfy/sd.py", line 478, in load_clip
    clip_data.append(comfy.utils.load_torch_file(p, safe_load=True))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Aipathon/git/ComfyUI/comfy/utils.py", line 34, in load_torch_file
    sd = safetensors.torch.load_file(ckpt, device=device.type)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Aipathon/.local/lib/python3.12/site-packages/safetensors/torch.py", line 313, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge

2024-11-28T17:16:13.779510 - Prompt executed in 17.16 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":37,"last_link_id":116,"nodes":[{"id":17,"type":"BasicScheduler","pos":[480,1008],"size":[315,106],"flags":{},"order":13,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":55,"slot_index":0}],"outputs":[{"name":"SIGMAS","type":"SIGMAS","links":[20],"shape":3}],"properties":{"Node name for S&R":"BasicScheduler"},"widgets_values":["simple",20,1]},{"id":16,"type":"KSamplerSelect","pos":[480,912],"size":[315,58],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"SAMPLER","type":"SAMPLER","links":[19],"shape":3}],"properties":{"Node name for S&R":"KSamplerSelect"},"widgets_values":["euler"]},{"id":26,"type":"FluxGuidance","pos":[480,144],"size":[317.4000244140625,58],"flags":{},"order":12,"mode":0,"inputs":[{"name":"conditioning","type":"CONDITIONING","link":41}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[42],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"FluxGuidance"},"widgets_values":[3.5],"color":"#233","bgcolor":"#355"},{"id":22,"type":"BasicGuider","pos":[576,48],"size":[222.3482666015625,46],"flags":{},"order":14,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":54,"slot_index":0},{"name":"conditioning","type":"CONDITIONING","link":42,"slot_index":1}],"outputs":[{"name":"GUIDER","type":"GUIDER","links":[30],"slot_index":0,"shape":3}],"properties":{"Node name for 
S&R":"BasicGuider"},"widgets_values":[]},{"id":13,"type":"SamplerCustomAdvanced","pos":[864,192],"size":[272.3617858886719,124.53733825683594],"flags":{},"order":15,"mode":0,"inputs":[{"name":"noise","type":"NOISE","link":37,"slot_index":0},{"name":"guider","type":"GUIDER","link":30,"slot_index":1},{"name":"sampler","type":"SAMPLER","link":19,"slot_index":2},{"name":"sigmas","type":"SIGMAS","link":20,"slot_index":3},{"name":"latent_image","type":"LATENT","link":116,"slot_index":4}],"outputs":[{"name":"output","type":"LATENT","links":[24],"slot_index":0,"shape":3},{"name":"denoised_output","type":"LATENT","links":null,"shape":3}],"properties":{"Node name for S&R":"SamplerCustomAdvanced"},"widgets_values":[]},{"id":25,"type":"RandomNoise","pos":[480,768],"size":[315,82],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"NOISE","type":"NOISE","links":[37],"shape":3}],"properties":{"Node name for S&R":"RandomNoise"},"widgets_values":[1037904651296659,"randomize"],"color":"#2a363b","bgcolor":"#3f5159"},{"id":8,"type":"VAEDecode","pos":[866,367],"size":[210,46],"flags":{},"order":16,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":24},{"name":"vae","type":"VAE","link":12}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[9],"slot_index":0}],"properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":6,"type":"CLIPTextEncode","pos":[384,240],"size":[422.84503173828125,164.31304931640625],"flags":{},"order":10,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":10}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[41],"slot_index":0}],"title":"CLIP Text Encode (Positive Prompt)","properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["cute anime girl with massive fluffy fennec ears and a big fluffy tail blonde messy long hair blue eyes wearing a maid outfit with a long black gold leaf pattern dress and a white apron mouth open holding a fancy black forest cake with candles on top in the kitchen of an 
old dark Victorian mansion lit by candlelight with a bright window to the foggy forest and very expensive stuff everywhere"],"color":"#232","bgcolor":"#353"},{"id":30,"type":"ModelSamplingFlux","pos":[480,1152],"size":[315,130],"flags":{},"order":11,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":56,"slot_index":0},{"name":"width","type":"INT","link":115,"slot_index":1,"widget":{"name":"width"}},{"name":"height","type":"INT","link":114,"slot_index":2,"widget":{"name":"height"}}],"outputs":[{"name":"MODEL","type":"MODEL","links":[54,55],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"ModelSamplingFlux"},"widgets_values":[1.15,0.5,1024,1024]},{"id":27,"type":"EmptySD3LatentImage","pos":[480,624],"size":[315,106],"flags":{},"order":9,"mode":0,"inputs":[{"name":"width","type":"INT","link":112,"widget":{"name":"width"}},{"name":"height","type":"INT","link":113,"widget":{"name":"height"}}],"outputs":[{"name":"LATENT","type":"LATENT","links":[116],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"EmptySD3LatentImage"},"widgets_values":[1024,1024,1]},{"id":34,"type":"PrimitiveNode","pos":[432,480],"size":[210,82],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","links":[112,115],"slot_index":0,"widget":{"name":"width"}}],"title":"width","properties":{"Run widget replace on values":false},"widgets_values":[1024,"fixed"],"color":"#323","bgcolor":"#535"},{"id":35,"type":"PrimitiveNode","pos":[672,480],"size":[210,82],"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","links":[113,114],"slot_index":0,"widget":{"name":"height"}}],"title":"height","properties":{"Run widget replace on values":false},"widgets_values":[1024,"fixed"],"color":"#323","bgcolor":"#535"},{"id":37,"type":"Note","pos":[480,1344],"size":[314.99755859375,117.98363494873047],"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["The reference sampling implementation auto 
adjusts the shift value based on the resolution, if you don't want this you can just bypass (CTRL-B) this ModelSamplingFlux node.\n"],"color":"#432","bgcolor":"#653"},{"id":28,"type":"Note","pos":[48,576],"size":[336,288],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["If you get an error in any of the nodes above make sure the files are in the correct directories.\n\nSee the top of the examples page for the links : https://comfyanonymous.github.io/ComfyUI_examples/flux/\n\nflux1-dev.safetensors goes in: ComfyUI/models/unet/\n\nt5xxl_fp16.safetensors and clip_l.safetensors go in: ComfyUI/models/clip/\n\nae.safetensors goes in: ComfyUI/models/vae/\n\n\nTip: You can set the weight_dtype above to one of the fp8 types if you have memory issues."],"color":"#432","bgcolor":"#653"},{"id":9,"type":"SaveImage","pos":[1155,196],"size":[985.3012084960938,1060.3828125],"flags":{},"order":17,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":9}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":11,"type":"DualCLIPLoader","pos":[33,275],"size":[315,106],"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[10],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"DualCLIPLoader"},"widgets_values":["ComfyUI_TextEncoder/flux_text_encoders/clip_l.safetensors","ComfyUI_TextEncoder/flux_text_encoders/t5xxl_fp8_e4m3fn.safetensors","flux"]},{"id":12,"type":"UNETLoader","pos":[48,144],"size":[315,82],"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[56],"slot_index":0,"shape":3}],"properties":{"Node name for 
S&R":"UNETLoader"},"widgets_values":["FLUX/flux_dev.safetensors","default"],"color":"#223","bgcolor":"#335"},{"id":10,"type":"VAELoader","pos":[48,432],"size":[311.81634521484375,60.429901123046875],"flags":{},"order":8,"mode":0,"inputs":[],"outputs":[{"name":"VAE","type":"VAE","links":[12],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["ae.safetensors"]}],"links":[[9,8,0,9,0,"IMAGE"],[10,11,0,6,0,"CLIP"],[12,10,0,8,1,"VAE"],[19,16,0,13,2,"SAMPLER"],[20,17,0,13,3,"SIGMAS"],[24,13,0,8,0,"LATENT"],[30,22,0,13,1,"GUIDER"],[37,25,0,13,0,"NOISE"],[41,6,0,26,0,"CONDITIONING"],[42,26,0,22,1,"CONDITIONING"],[54,30,0,22,0,"MODEL"],[55,30,0,17,0,"MODEL"],[56,12,0,30,0,"MODEL"],[112,34,0,27,0,"INT"],[113,35,0,27,1,"INT"],[114,35,0,30,2,"INT"],[115,34,0,30,1,"INT"],[116,27,0,13,4,"LATENT"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.6830134553650705,"offset":[329.28331000000037,-126.39162999999948]},"groupNodes":{"EmptyLatentImage":{"nodes":[{"type":"PrimitiveNode","pos":[432,480],"size":{"0":210,"1":82},"flags":{},"order":6,"mode":0,"outputs":[{"name":"INT","type":"INT","links":[],"widget":{"name":"height"},"slot_index":0}],"title":"height","properties":{"Run widget replace on values":false},"color":"#323","bgcolor":"#535","index":0},{"type":"PrimitiveNode","pos":[672,480],"size":{"0":210,"1":82},"flags":{},"order":7,"mode":0,"outputs":[{"name":"INT","type":"INT","links":[],"slot_index":0,"widget":{"name":"width"}}],"title":"width","properties":{"Run widget replace on values":false},"color":"#323","bgcolor":"#535","index":1},{"type":"EmptySD3LatentImage","pos":[480,624],"size":{"0":315,"1":106},"flags":{},"order":10,"mode":0,"inputs":[{"name":"width","type":"INT","link":null,"widget":{"name":"width"}},{"name":"height","type":"INT","link":null,"widget":{"name":"height"}}],"outputs":[{"name":"LATENT","type":"LATENT","links":[],"shape":3,"slot_index":0}],"properties":{"Node name for 
S&R":"EmptySD3LatentImage"},"widgets_values":[1024,1024,1],"index":2}],"links":[[1,0,2,0,34,"INT"],[0,0,2,1,35,"INT"]],"external":[[0,0,"INT"],[1,0,"INT"],[2,0,"LATENT"]],"config":{"0":{"output":{"0":{"name":"height"}},"input":{"value":{"visible":true}}},"1":{"output":{"0":{"name":"width"}},"input":{"value":{"visible":true}}},"2":{"input":{"width":{"visible":false},"height":{"visible":false}}}}}}},"version":0.4}

Additional Context




### Other

The error also occurs if I use different models in the DualCLIPLoader.
ltdrdata commented 2 days ago

Make sure the model files are valid.
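One way to validate a downloaded model file is to compare its SHA-256 hash against the checksum published on the model's download page (e.g. the "SHA256" field shown for each file on Hugging Face). A minimal sketch; the path and expected hash below are placeholders, not real values:

```python
import hashlib


def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MB chunks so multi-GB model files don't exhaust RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()


# Compare against the checksum listed on the model's download page, e.g.:
# expected = "..."  # placeholder: copy the real hash from the file's page
# assert sha256_of("models/clip/clip_l.safetensors") == expected
```

If the hashes differ, the download is incomplete or corrupted and the file should be re-downloaded.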

Aipathon commented 2 days ago

> Make sure the model files are valid.

They are correct and in the right spot

ltdrdata commented 2 days ago

> Make sure the model files are valid.
>
> They are correct and in the right spot

How did you validate the file?

Aipathon commented 2 days ago

> Make sure the model files are valid.
>
> They are correct and in the right spot
>
> How did you validate the file?

I checked if the name is the same as the one that should be there

ltdrdata commented 2 days ago

> Make sure the model files are valid.
>
> They are correct and in the right spot
>
> How did you validate the file?
>
> I checked if the name is the same as the one that should be there

No. You have to check whether the file itself is corrupted; that error message usually means the file is broken.

Typically this happens when the download fails midway, or when a link (e.g. an HTML page) is saved instead of the model file itself.
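A `.safetensors` file begins with an 8-byte little-endian header length followed by that many bytes of JSON metadata, so a quick sanity check can often distinguish a real model file from, say, a saved HTML error page (whose first bytes decode to an absurdly large length, which is exactly what triggers `HeaderTooLarge`). The script below is an illustrative sketch; the 100 MB header cap is an arbitrary heuristic, not part of the format:

```python
import json
import struct
import sys


def check_safetensors(path):
    """Heuristically check whether a .safetensors header is readable.

    Returns (ok, message). A failure here usually means the file is
    truncated, or is not a safetensors file at all (e.g. an HTML page
    saved by a failed download).
    """
    with open(path, "rb") as f:
        prefix = f.read(8)
        if len(prefix) < 8:
            return False, "file too small to contain a safetensors header"
        # First 8 bytes: little-endian u64 giving the JSON header length.
        (header_len,) = struct.unpack("<Q", prefix)
        # Heuristic: a JSON header over ~100 MB is implausible for a model file.
        if header_len > 100_000_000:
            return False, f"implausible header length {header_len} (file is likely HTML or garbage)"
        header = f.read(header_len)
        if len(header) < header_len:
            return False, "file truncated before end of header"
        try:
            json.loads(header)
        except ValueError:
            return False, "header is not valid JSON"
    return True, "header looks valid"


if __name__ == "__main__":
    ok, msg = check_safetensors(sys.argv[1])
    print(("OK" if ok else "BROKEN") + ": " + msg)
```

Running this against each file referenced by the DualCLIPLoader (here `clip_l.safetensors` and `t5xxl_fp8_e4m3fn.safetensors`) should point at the broken one; re-downloading it is the usual fix.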