Siifu closed this issue 8 months ago
@Acly looking forward to your reply :)
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
This usually means that the file is corrupt. The error doesn't say which file, but since you say it happens with LCM, it's probably the LCM LoRA. Try deleting and re-downloading them.
They're in the server path ComfyUI/models/loras
Links: lcm-lora-sdv1-5.safetensors, lcm-lora-sdxl.safetensors
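If it's unclear which file is the corrupt one, a quick check is possible: a .safetensors file begins with an 8-byte little-endian length followed by that many bytes of JSON header, and HeaderTooLarge means that length field is implausibly huge. This is a minimal sketch (the loras path and the 100 MB sanity threshold are assumptions, not anything ComfyUI does):

```python
import json
import struct
from pathlib import Path

def check_safetensors(path):
    # A .safetensors file starts with an 8-byte little-endian header
    # length, followed by that many bytes of JSON metadata.
    with open(path, "rb") as f:
        raw = f.read(8)
        if len(raw) < 8:
            return "truncated file"
        (n,) = struct.unpack("<Q", raw)
        if n > 100_000_000:  # assumed sanity limit; real headers are far smaller
            return f"bad header length {n} (would raise HeaderTooLarge)"
        try:
            json.loads(f.read(n))
        except (ValueError, UnicodeDecodeError):
            return "header is not valid JSON"
    return "header looks OK"

# Assumed location; adjust to your install.
for p in Path("ComfyUI/models/loras").glob("**/*.safetensors"):
    print(p.name, "->", check_safetensors(p))
```

Any file this flags as having a bad header length is the one to delete and re-download.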
Thank you. I originally thought the problem was with the sampler, and that my environment was missing the sampler for LCM. However, following your suggestion to investigate the LoRA files, I found the reason: I had used symbolic links (the "mklink" command on Windows and "ln -s" on macOS) to map the "loras" folder from my external hard drive into the "comfyui" folder. This left the "loras" directory in "comfyui" containing phantom LoRA entries that couldn't load properly; they appeared in the list with a period prefixed to their names.
I assume that your workflow defaults to selecting the first available LoRA in the list, so these duplicate but unusable phantom entries were causing the errors in the LCM sampler's workflow.
Now that I've moved the LoRAs to a local location, everything works fine. Thank you very much!
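For anyone hitting the same thing: the dot-prefixed entries are most likely AppleDouble companion files ("._<name>") that macOS writes next to files on external drives. A hypothetical cleanup sketch (the loras path is an assumption, adjust to your install):

```shell
# macOS leaves "._<name>" AppleDouble files next to files on external
# drives; after symlinking the folder in, they show up as unloadable
# LoRA entries. Path is an example, not the real install location.
LORA_DIR="models/loras"
mkdir -p "$LORA_DIR"   # ensure the path exists for this sketch

# List the offending files first:
find "$LORA_DIR" -type f -name '._*'

# Then remove them:
find "$LORA_DIR" -type f -name '._*' -delete
```

On macOS itself, the built-in `dot_clean` utility can also merge or remove these companion files.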
That makes sense. The name matching for LoRA files allows an arbitrary prefix so people can organize them into subfolders, so it likely picked up one of the files prefixed with "."
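To illustrate why prefix-tolerant matching picks up the junk files, here is a hypothetical sketch of suffix-based lookup (not the project's actual matching code):

```python
def find_lora(available, name):
    # Suffix matching lets users nest LoRAs in subfolders, e.g.
    # "sd15/lcm-lora-sdv1-5.safetensors" still matches the plain name.
    for f in available:
        if f.endswith(name):
            return f
    return None

files = [
    "._lcm-lora-sdv1-5.safetensors",  # AppleDouble junk from macOS
    "lcm-lora-sdv1-5.safetensors",
]
# The junk file also ends with the requested name, so it wins
# whenever it sorts first in the list:
print(find_lora(files, "lcm-lora-sdv1-5.safetensors"))
# -> ._lcm-lora-sdv1-5.safetensors
```

Loading that "._" metadata file as a safetensors file then fails with exactly the HeaderTooLarge error above.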
Hey, I use an M1 Max MBP to run the project. Here is the error output:
My terminal:

ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "/Users/sifu/ComfyUI/execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "/Users/sifu/ComfyUI/execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "/Users/sifu/ComfyUI/execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "/Users/sifu/ComfyUI/nodes.py", line 569, in load_lora
    lora = comfy.utils.load_torch_file(lora_path, safe_load=True)
  File "/Users/sifu/ComfyUI/comfy/utils.py", line 13, in load_torch_file
    sd = safetensors.torch.load_file(ckpt, device=device.type)
  File "/Users/sifu/miniconda3/lib/python3.11/site-packages/safetensors/torch.py", line 308, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
My client.log:

2023-12-22 20:08:29,627 ERROR Job 52f5aa72-e8ed-47fc-a901-26e93fa65c09 failed: Error while deserializing header: HeaderTooLarge
(stack trace identical to the terminal output above, from execution.py recursive_execute down to safetensors/torch.py load_file)