sipie800 / ComfyUI-PuLID-Flux-Enhanced

Apache License 2.0
79 stars 13 forks

error: expected scalar type Half but found BFloat16 #5

Closed by ClothingAI 1 month ago

ClothingAI commented 1 month ago

I keep getting this error, and the workflow stops at the SamplerCustomAdvanced node:

I don't understand what to do:


What is the problem?

sipie800 commented 1 month ago

I can't see your error message. In my experience, the problem is not the sampler; your GGUF UNet loader may be the issue. Please try reporting it back to the GGUF loader node's repo. If it's not about the node, are you using a GPU that does not support bfloat16?

ClothingAI commented 1 month ago

Quadro 6000, does that not support it? If not, what should I do? It has 24 GB of VRAM.

sipie800 commented 1 month ago

bf16 requires at least compute capability 8.0, and the Quadro 6000 may be 7.5. I'm not quite an expert on these things. You need to use an fp16 checkpoint of Flux together with the regular UNet loader node.
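As a quick sanity check for the claim above: bf16 tensor ops on NVIDIA GPUs need compute capability 8.0+ (Ampere or newer), and Turing cards such as the Quadro RTX 6000 are 7.5. A minimal sketch (the `supports_bf16` helper is hypothetical, not part of ComfyUI; the commented lines show the real PyTorch calls to run on a live machine):

```python
def supports_bf16(major: int, minor: int) -> bool:
    """bf16 hardware support starts at compute capability 8.0 (Ampere)."""
    return (major, minor) >= (8, 0)

# Turing (e.g. Quadro RTX 6000) is CC 7.5 -> no bf16
print(supports_bf16(7, 5))  # False
# Ampere (e.g. RTX 3090) is CC 8.6 -> bf16 OK
print(supports_bf16(8, 6))  # True

# On a machine with PyTorch and a CUDA GPU, check directly:
# import torch
# if torch.cuda.is_available():
#     print(torch.cuda.get_device_capability(0))  # e.g. (7, 5)
#     print(torch.cuda.is_bf16_supported())
```

If `is_bf16_supported()` returns False, any node that builds bf16 tensors will raise dtype errors like the one in this issue, so the workflow has to stay in fp16/fp8.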

ClothingAI commented 1 month ago

I could not even get a regular Flux PuLID workflow to work. Can you suggest a JSON with fp16, please, with the precise model names, so I have no doubts and no errors?

sipie800 commented 1 month ago

You may try pulid_flux_16bit_simple.json. Be aware that it may consume more VRAM. You can also just grab an fp8 checkpoint of Flux dev and use it in the fp16 workflow; that's more feasible.
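To see why fp8 is "more feasible" on a 24 GB card, a rough back-of-envelope estimate of the UNet weight footprint alone (assuming ~12B parameters for the Flux.1-dev transformer; activations, text encoders, and the VAE add more on top, and the `weight_vram_gb` helper is just for illustration):

```python
def weight_vram_gb(n_params: float, bytes_per_param: int) -> float:
    """VRAM needed just to hold the weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

FLUX_DEV_PARAMS = 12e9  # Flux.1-dev transformer is roughly 12B parameters

for fmt, nbytes in [("fp16/bf16", 2), ("fp8", 1)]:
    print(f"{fmt}: ~{weight_vram_gb(FLUX_DEV_PARAMS, nbytes):.1f} GiB")
```

fp16 weights alone land around 22 GiB, leaving almost no headroom on 24 GB once the rest of the pipeline loads, while fp8 halves that to about 11 GiB.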

ClothingAI commented 1 month ago

> may try pulid_flux_16bit_simple.json. Be aware that it's may consume more VRAM. You can just grab and use a fp8 checkpoint of flux dev in the fp16 workflow, it's more feasible.

Thanks! It seems I have enough VRAM:

But I keep getting this specific error: https://github.com/balazik/ComfyUI-PuLID-Flux/issues/33#issuecomment-2434144823

ClothingAI commented 1 month ago

Even this: