rafiislambd opened 2 months ago
Any format supported by ComfyUI can be converted to fp8, so this custom node can convert models in different formats such as AuraFlow, SDXL, HunyuanDiT, etc. However, if ComfyUI does not recognize the model type correctly, an error is likely. For example, the safetensors model at https://huggingface.co/black-forest-labs/FLUX.1-schnell/tree/main is not recognized correctly by the "Load Checkpoint" node, so trying to convert it to fp8 through "Load Checkpoint" fails. I confirmed that this Flux model can be converted to fp8 via the "Load Diffusion Model" node instead.
How can I convert the flux1-schnell NF4 model? If it is possible, please share a workflow... thanks
NF4 (NormalFloat4) is a 4-bit quantization format used in techniques such as QLoRA. This custom node, on the other hand, is designed to convert to FP8 (an 8-bit floating-point format). To use an NF4 model with this node, the NF4 weights would first have to be dequantized back to an intermediate format such as FP16 or FP32 and then converted to FP8. I have no plans to add that functionality.
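For intuition about the FP8 target format: ComfyUI's fp8 conversion typically uses the E4M3 layout (1 sign bit, 4 exponent bits with bias 7, 3 mantissa bits, no infinities). A minimal pure-Python sketch, written by me for illustration, that enumerates the finite E4M3 values and rounds a number onto that grid:

```python
def e4m3_values():
    # Enumerate all finite values of float8 E4M3 ("fn" variant):
    # 1 sign bit, 4 exponent bits (bias 7), 3 mantissa bits.
    vals = set()
    for sign in (1.0, -1.0):
        for e in range(16):
            for m in range(8):
                if e == 15 and m == 7:
                    continue  # this encoding is NaN in the "fn" variant
                if e == 0:
                    v = sign * (m / 8) * 2.0 ** -6          # subnormal
                else:
                    v = sign * (1 + m / 8) * 2.0 ** (e - 7)  # normal
                vals.add(v)
    return sorted(vals)

def quantize_e4m3(x, table=tuple(e4m3_values())):
    # Round-to-nearest onto the representable E4M3 grid.
    return min(table, key=lambda v: abs(v - x))
```

For example, `quantize_e4m3(0.1)` snaps to the nearest representable value 0.1015625, and anything above the format's maximum clamps to 448.0, which is why very large weights must be rescaled or clipped before casting.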
Please add support for the Flux + Kolors models... thanks