Open txhno opened 3 months ago
same
What is the URL of the model, and what is its size?
Same question; my model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB)
That is not a checkpoint model. You cannot load it via the Checkpoint Loader node. You have to use Load Diffusion Model instead.
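To make the distinction concrete: a full "checkpoint" bundles the diffusion model together with the text encoder(s) and VAE, while flux1-dev.safetensors contains only the diffusion weights. A rough sketch of how one could pick a loader from the tensor names — note the key prefixes below are illustrative examples, not ComfyUI's actual detection logic:

```python
def guess_loader(tensor_names):
    """Pick a loader node based on which weight groups are present.

    The prefixes are common examples of how VAE / text-encoder weights
    are namespaced in bundled checkpoints; real files vary.
    """
    has_vae = any(n.startswith(("first_stage_model.", "vae."))
                  for n in tensor_names)
    has_text_encoder = any(n.startswith(("cond_stage_model.", "text_encoders."))
                           for n in tensor_names)
    if has_vae and has_text_encoder:
        return "CheckpointLoaderSimple"  # full bundle: diffusion + CLIP + VAE
    return "Load Diffusion Model"        # bare diffusion weights only


# flux1-dev.safetensors ships only the transformer/diffusion tensors:
print(guess_loader(["double_blocks.0.img_attn.qkv.weight"]))
# a bundled fp8 checkpoint also carries VAE and text-encoder tensors:
print(guess_loader(["model.diffusion_model.x", "vae.decoder.w",
                    "text_encoders.clip_l.y"]))
```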
it works for me! thank you! ^0^
I followed this guide, and it confused me.
Same with me.
The full version of flux1[^1] should be loaded with the diffusion loader, but the lite version of flux1[^2] with the checkpoint loader, right?
[^1]: fp16, size ~23 GB
[^2]: fp8, size ~12 GB
Same with me.
- MacBook M1 Max, 32 GB
- latest ComfyUI
- flux1-dev-fp8.sft and flux1-schnell-fp8.sft
> The full version of flux1 (fp16, ~23 GB) should be loaded with the diffusion loader, but the lite version (fp8, ~12 GB) with the checkpoint loader, right?
The fp8 17 GB model is a checkpoint and can be loaded via the Checkpoint Loader. Other models should be loaded via the Diffusion Loader.
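If you are unsure what a downloaded .safetensors file actually contains, you can list its tensor names without loading any weights: the safetensors format starts with an 8-byte little-endian header length, followed by a JSON header mapping tensor names to metadata. A small stdlib-only sketch:

```python
import json
import struct

def safetensors_keys(path):
    """List tensor names from a .safetensors file by reading only its
    JSON header (8-byte little-endian length prefix, then JSON)."""
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    return sorted(k for k in header if k != "__metadata__")
```

If the listing shows only diffusion-model tensors, use Load Diffusion Model; if VAE and text-encoder tensors are present as well, the Checkpoint Loader should accept the file.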
With the regular Checkpoint Loader, I got this error. Only with NF4 did it work. https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
Or GGUF + DualCLIPLoader
The correct FP8 checkpoint model to load is: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors. There has been some confusion that led many people to download the wrong model; this link points to the right one.
Expected Behavior
The checkpoint "flux1-dev-fp8.safetensors" should load successfully using the CheckpointLoaderSimple node.
Actual Behavior
An error occurs when attempting to load the checkpoint, stating "ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors"
Steps to Reproduce
- Attempt to load the checkpoint "flux1-dev-fp8.safetensors" using CheckpointLoaderSimple node
- Queue the workflow
Debug Logs
```
Error occurred when executing CheckpointLoaderSimple:

ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors

  File "[path]/execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "[path]/execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "[path]/execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "[path]/nodes.py", line 518, in load_checkpoint
    out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
  File "[path]/comfy/sd.py", line 513, in load_checkpoint_guess_config
    raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))
```
I'm on the newest version of ComfyUI; I know this because I cloned master a few minutes ago, copied over the models and custom_nodes dirs, and tried again. ComfyUI Manager also says I'm up to date.
> With the regular Checkpoint Loader, I got this error. Only with NF4 did it work. https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
> Or GGUF + DualCLIPLoader
That is not a regular checkpoint loader, and that node is only compatible with NF4 checkpoints.
> The correct FP8 checkpoint model to load is: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors
I completely agree, but it is also pretty hard to guess what kind of checkpoint or model you downloaded in the past. My collection was multiple TBs when I started over completely and ran into the same problem again. ;(
fp8 can only be used on the H100.
For V100s or A100s, which models should be used, and which node in the workflow should be replaced?
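On the hardware question (my understanding, not an official statement): native fp8 matmul kernels require NVIDIA compute capability 8.9 (Ada) or 9.0 (Hopper, e.g. H100). V100 (7.0) and A100 (8.0) lack them, but fp8 weights can still be stored and upcast to fp16/bf16 at compute time, trading speed for memory savings. A tiny sketch of that capability check:

```python
def native_fp8_compute(capability):
    """True if the GPU has native fp8 (e4m3/e5m2) matmul support.

    Compute capability 8.9 = Ada (RTX 40xx), 9.0 = Hopper (H100).
    V100 is (7, 0) and A100 is (8, 0): both would fall back to
    upcasting fp8 weights to fp16/bf16 before the matmul.
    """
    return capability >= (8, 9)

for name, cap in [("V100", (7, 0)), ("A100", (8, 0)), ("H100", (9, 0))]:
    print(name, native_fp8_compute(cap))
```

In practice this is the tuple PyTorch returns from `torch.cuda.get_device_capability()`.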