comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

GGUF, bnb_NF4 merges #4488

Open cbresciano opened 2 months ago

cbresciano commented 2 months ago

Your question

I can merge Flux1 models in ComfyUI: FP16 and Kijai's FP8 work using LoadCheckpoint and LoadDiffusionModel respectively. However, I cannot merge the GGUF and bnb-NF4 models, even when I use the appropriate loaders. Is this a problem with the loaders or with the ModelMergeFlux1 node? Must we wait for a new version of the loaders? With the 4-bit models the merge fails, and CheckpointSave reports a "no data" error.

Logs

No response

Other

No response

mcmonkey4eva commented 2 months ago

These are entirely different data formats internally. To merge them, something specialized would be needed to dequantize the weights before merging and then requantize them at the end.

(FP16 and FP8 are both unquantized raw data so they can happily merge as normal)
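To illustrate why quantized weights can't be merged directly, here is a minimal sketch of the dequantize → merge → requantize idea. It uses a toy symmetric 4-bit quantizer for clarity; the real GGUF and NF4 formats are different (block-wise scales, and NF4 uses a non-uniform codebook), so this is only a conceptual sketch, not ComfyUI's or any loader's actual code.

```python
import numpy as np

def quantize(w, num_bits=4):
    # Toy symmetric quantization: one scale per tensor.
    # Real GGUF/NF4 use per-block scales and (for NF4) a codebook.
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.abs(w).max() / qmax
    q = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

def merge_quantized(q_a, s_a, q_b, s_b, ratio=0.5):
    # The merge must happen in float space: the integer codes of two
    # tensors with different scales cannot be averaged meaningfully.
    merged = ratio * dequantize(q_a, s_a) + (1 - ratio) * dequantize(q_b, s_b)
    return quantize(merged)  # requantize the result

rng = np.random.default_rng(0)
w_a = rng.normal(size=128).astype(np.float32)
w_b = rng.normal(size=128).astype(np.float32)
q, scale = merge_quantized(*quantize(w_a), *quantize(w_b))
```

The round trip loses some precision to quantization error at each step, which is the cost of merging quantized checkpoints. FP16 and FP8, by contrast, can be blended element-wise with no decode step at all.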