Shiba-2-shiba / ComfyUI_DiffusionModel_fp8_converter

A custom ComfyUI node for converting model and CLIP weights to fp8

Please add support for Flux + Kolors models #3

Open rafiislambd opened 2 months ago

rafiislambd commented 2 months ago

Please add support for the Flux and Kolors models. Thanks.

Shiba-2-shiba commented 2 months ago

Any format supported by ComfyUI can be converted to fp8, so this custom node can convert models such as AuraFlow, SDXL, and HunyuanDiT. However, if ComfyUI does not recognize the model type correctly, an error is likely. For example, the safetensors model at https://huggingface.co/black-forest-labs/FLUX.1-schnell/tree/main is not recognized correctly by the "Load Checkpoint" node, so trying to convert it to fp8 through that node fails. I confirmed that the Flux model can be converted to fp8 with the "Load Diffusion Model" node.
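
For reference, the core of an fp8 conversion is just casting the floating-point weights of the state dict. The sketch below is a minimal, standalone illustration using safetensors and PyTorch (2.1 or later for `torch.float8_e4m3fn`); it is not the node's actual code, and the file names are placeholders.

```python
# Minimal sketch of an fp8 weight conversion with safetensors + PyTorch.
# Not the custom node's implementation; paths are placeholders.
import torch
from safetensors.torch import load_file, save_file

def convert_to_fp8(src_path: str, dst_path: str) -> None:
    state_dict = load_file(src_path)
    converted = {}
    for name, tensor in state_dict.items():
        # Cast only floating-point weights; keep integer buffers unchanged.
        if tensor.is_floating_point():
            converted[name] = tensor.to(torch.float8_e4m3fn)
        else:
            converted[name] = tensor
    save_file(converted, dst_path)

convert_to_fp8("flux1-schnell.safetensors", "flux1-schnell-fp8.safetensors")
```

The point of going through ComfyUI's own loader nodes instead of a raw cast like this is that ComfyUI identifies the model type and key layout for you, which is exactly what fails when "Load Checkpoint" does not recognize the file.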

rafiislambd commented 2 months ago

How do I convert the flux1-schnell NF4 model? If it is possible, please share a workflow. Thanks.


Shiba-2-shiba commented 2 months ago

NF4 (NormalFloat4) is a 4-bit quantization format used in techniques such as QLoRA. This custom node, on the other hand, is designed to convert to FP8 (an 8-bit floating-point format). To use an NF4 model with this custom node, the weights would first have to be dequantized back to an intermediate format such as FP16 or FP32 and then converted to FP8. I have no plans to add that functionality.
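
To make the extra step concrete, here is a minimal sketch of that round trip for a single weight tensor, assuming bitsandbytes (which requires a CUDA device) and PyTorch 2.1 or later. The random tensor stands in for one layer's weights; this is an illustration of the dequantize-then-recast chain, not a workflow the node provides.

```python
# Sketch: why NF4 weights need an intermediate dequantization before FP8.
# Assumes bitsandbytes on a CUDA device; the tensor is a stand-in for one weight.
import torch
import bitsandbytes.functional as bnb

weight_fp16 = torch.randn(1024, 1024, dtype=torch.float16, device="cuda")

# Quantize to NF4: the result is packed 4-bit data plus a quantization state.
weight_nf4, quant_state = bnb.quantize_4bit(weight_fp16, quant_type="nf4")

# The packed NF4 bytes cannot be cast to fp8 directly; they must first be
# dequantized back to a regular floating-point tensor.
weight_restored = bnb.dequantize_4bit(weight_nf4, quant_state, quant_type="nf4")

# Only now does a cast to an 8-bit floating-point format make sense.
weight_fp8 = weight_restored.to(torch.float8_e4m3fn)
```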