comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, API, and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0
56.18k stars · 5.95k forks

Error: Could not detect model type when loading checkpoint #4329

Open txhno opened 3 months ago

txhno commented 3 months ago

Expected Behavior

The checkpoint "flux1-dev-fp8.safetensors" should load successfully using the CheckpointLoaderSimple node.

Actual Behavior

An error occurs when attempting to load the checkpoint, stating "ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors"

Steps to Reproduce

  1. Attempt to load the checkpoint "flux1-dev-fp8.safetensors" using CheckpointLoaderSimple node
  2. Queue the workflow

Debug Logs

```
Error occurred when executing CheckpointLoaderSimple: ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors
File "[path]/execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "[path]/execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "[path]/execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "[path]/nodes.py", line 518, in load_checkpoint
    out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
File "[path]/comfy/sd.py", line 513, in load_checkpoint_guess_config
    raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))
```

I'm on the newest version of ComfyUI; I cloned master a few minutes ago, copied over my models and custom_nodes dirs, and tried again. ComfyUI Manager also says I'm up to date.

cczw2010 commented 3 months ago

same

ltdrdata commented 3 months ago

What is the URL of the model, and what is its size?

kane-le commented 3 months ago

Same question. My model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).

ltdrdata commented 3 months ago

> Same question. My model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB)

That is not a checkpoint model. You cannot load that model via the Checkpoint Loader node. You have to use Load Diffusion Model instead.

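The distinction can also be checked programmatically. The following is an illustrative sketch, not ComfyUI's actual detection code: the key prefixes are assumptions based on common Flux file layouts (bundled checkpoints namespace the UNet under `model.diffusion_model.` and ship CLIP/VAE weights alongside it, while a bare Flux transformer file exposes top-level keys such as `double_blocks.*`):

```python
def guess_loader(tensor_names):
    """Guess which ComfyUI loader a .safetensors file needs from its
    tensor names. The prefixes below are assumptions, not ComfyUI's real
    detection logic: a bundled checkpoint nests the UNet weights under
    'model.diffusion_model.', while a bare Flux transformer file has
    top-level keys like 'double_blocks.*' or 'img_in.*'."""
    bundled = any(n.startswith("model.diffusion_model.") for n in tensor_names)
    bare = any(
        n.startswith(("double_blocks.", "single_blocks.", "img_in."))
        for n in tensor_names
    )
    if bundled:
        return "CheckpointLoaderSimple"
    if bare:
        return "Load Diffusion Model (UNETLoader)"
    return "unknown"
```

In practice the tensor names can be read without loading the weights, e.g. with `safe_open(path, framework="pt")` from the `safetensors` library and its `keys()` method.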

kane-le commented 3 months ago

> > Same question. My model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB)
>
> That is not a checkpoint model. You cannot load that model via the Checkpoint Loader node. You have to use Load Diffusion Model instead.

It works for me! Thank you! ^0^


I was following this guide, and it confused me.

Komorebi-Nine commented 3 months ago

Same with me.

  1. macbook m1 max 32gb
  2. latest comfyui
  3. flux1-dev-fp8.sft and flux1-schnell-fp8.sft

The full version of flux1[^1] should be loaded by the diffusion loader, but the lite version of flux1[^2] should be loaded by the checkpoint loader, right?

[^1]: fp16, size ~23 GB
[^2]: fp8, size ~12 GB

ltdrdata commented 3 months ago

> Same with me.
>
>   1. macbook m1 max 32gb
>   2. latest comfyui
>   3. flux1-dev-fp8.sft and flux1-schnell-fp8.sft
>
> The full version of flux1 (fp16, ~23 GB) should be loaded by the diffusion loader, but the lite version (fp8, ~12 GB) by the checkpoint loader, right?

The 17 GB fp8 model is a full checkpoint and can be loaded via the Checkpoint Loader. The other models should be loaded via the Diffusion Loader.

dirgunchik2008 commented 2 months ago

With the regular Checkpoint Loader, I got this error. It only worked with NF4: https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4

Or GGUF + dual CLIP loader.

taiczhi commented 2 months ago

The correct FP8 checkpoint model for this loader is available at: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors. There appears to have been some confusion leading many people to use the wrong model; this link points to the right file.


ltdrdata commented 2 months ago

> With the regular Checkpoint Loader, I got this error. It only worked with NF4: https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
>
> Or GGUF + dual CLIP loader.

That is not a regular checkpoint loader, and that node is only compatible with NF4 checkpoints.

Freighter commented 1 month ago

> The correct FP8 checkpoint model for this loader is available at: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors. There appears to have been some confusion leading many people to use the wrong model; this link points to the right file.

I completely agree, but it is also pretty hard to guess what kind of checkpoint or model you downloaded in the past. My collection was multiple TBs when I started over completely and ran into the same problem again. ;(
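With a large unsorted collection, sorting files by size at least narrows down what each one is likely to be, given the rough rule of thumb from this thread (~23 GB fp16 bare model, ~17 GB fp8 bundled checkpoint, ~12 GB fp8 bare model). A minimal sketch, assuming the models sit under one root directory:

```python
import os

def list_models(root):
    """Walk `root` and return (filename, size in GiB) for every
    .safetensors / .sft file found, largest first, so files can be
    matched against known model sizes by eye."""
    found = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            if name.endswith((".safetensors", ".sft")):
                path = os.path.join(dirpath, name)
                found.append((name, os.path.getsize(path) / 2**30))
    return sorted(found, key=lambda item: item[1], reverse=True)
```

Pairing this with a key inspection (e.g. via the `safetensors` library) would then tell bundled checkpoints apart from bare diffusion models.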

lucasjinreal commented 1 day ago

fp8 can only be used on H100.

For V100s or A100s, which models should be used, and which node in the workflow should be replaced?
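One way to frame this is by CUDA compute capability. As a hedged rule of thumb (an assumption based on NVIDIA's architecture numbering, not something stated in this thread): V100 is sm_70, A100 is sm_80, and native FP8 tensor-core support arrived with Ada Lovelace (sm_89) and Hopper (sm_90). A sketch of the check:

```python
def supports_native_fp8(major, minor):
    """Return True if a GPU with CUDA compute capability (major, minor)
    has native FP8 tensor-core support. Assumption: FP8 arrived with
    Ada Lovelace (sm_89) and Hopper (sm_90); older parts such as
    V100 (sm_70) and A100 (sm_80) do not have it and would fall back
    to fp16/bf16 weights or quantized (NF4 / GGUF) variants with the
    corresponding loader nodes."""
    return (major, minor) >= (8, 9)
```

With PyTorch installed, the running GPU's capability can be read via `torch.cuda.get_device_capability()`. On a V100 or A100, per the suggestions earlier in this thread, the fp16 weights via Load Diffusion Model, or the NF4 / GGUF variants with their respective loader nodes, are the likely fallbacks.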