yolain / ComfyUI-Easy-Use

To make ComfyUI easier to use, I have optimized and integrated some commonly used nodes.
GNU General Public License v3.0
875 stars · 56 forks

With the new fluxLoader, whether loading flux1-dev-bnb-nf4-v2.safetensors or flux1-schnell_fp8_unet_vae_clip, the error `4-bit quantization data type None is not implemented.` is occasionally raised #325

Open terliu opened 3 weeks ago

terliu commented 3 weeks ago

As the title says, the ComfyUI_bitsandbytes_NF4 plugin is already installed. Loading the flux1-schnell_fp8_unet_vae_clip model produces the first error below (screenshots attached). Loading the flux1-dev-bnb-nf4-v2.safetensors model produces the second error (screenshot attached).

terliu commented 3 weeks ago

Switching to the workflow below runs successfully (screenshot attached).

terliu commented 3 weeks ago

```
!!! Exception during processing !!! 4-bit quantization data type None is not implemented.
Traceback (most recent call last):
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 316, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 191, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 168, in _map_node_over_list
    process_inputs(input_dict, i)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 157, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5392, in simple
    return super().run(pipe, None, None, None, None, None, image_output, link_id, save_prefix,
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5361, in run
    return process_sample_state(pipe, samp_model, samp_clip, samp_samples, samp_vae, samp_seed, samp_positive, samp_negative, steps, start_step, last_step, cfg, sampler_name, scheduler, denoise, image_output, link_id, save_prefix, tile_size, prompt, extra_pnginfo, my_unique_id, preview_latent, force_full_denoise, disable_noise, samp_custom)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5131, in process_sample_state
    samp_samples, _ = sampler.custom_advanced_ksampler(noise, guider, _sampler, sigmas, samp_samples)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\libs\sampler.py", line 182, in custom_advanced_ksampler
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask,
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 706, in sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\sampler_helpers.py", line 66, in prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required, minimum_memory_required=minimum_memory_required)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 527, in load_models_gpu
    cur_loaded_model = loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 325, in model_load
    raise e
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 319, in model_load
    self.real_model = self.model.patch_model_lowvram(device_to=patch_model_to, lowvram_model_memory=lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_patcher.py", line 442, in patch_model_lowvram
    self.lowvram_load(device_to, lowvram_model_memory=lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_patcher.py", line 427, in lowvram_load
    x[2].to(device_to)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1160, in to
    return self._apply(convert)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 833, in _apply
    param_applied = fn(param)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1158, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\bitsandbytes_NF4\__init__.py", line 62, in to
    return self._quantize(device)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\bitsandbytes\nn\modules.py", line 289, in _quantize
    w_4bit, quant_state = bnb.functional.quantize_4bit(
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\bitsandbytes\functional.py", line 1157, in quantize_4bit
    raise NotImplementedError(f"4-bit quantization data type {quant_type} is not implemented.")
NotImplementedError: 4-bit quantization data type None is not implemented.
```
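For context, the bottom frame shows that `bitsandbytes.functional.quantize_4bit` only accepts `fp4` or `nf4` as the quantization data type, and the NF4 wrapper here ends up passing `quant_type=None`. A simplified sketch of that guard (illustrative only, not the real library code, which performs actual CUDA quantization):

```python
# Simplified sketch of the validation in bitsandbytes quantize_4bit.
# The real function quantizes the tensor on GPU; here we only model the guard.
def quantize_4bit(tensor, quant_type="fp4"):
    if quant_type not in ("fp4", "nf4"):
        raise NotImplementedError(
            f"4-bit quantization data type {quant_type} is not implemented."
        )
    return tensor  # placeholder for the quantized result


# Passing quant_type=None reproduces the error seen in the traceback:
try:
    quantize_4bit([1.0, 2.0], quant_type=None)
except NotImplementedError as e:
    print(e)  # 4-bit quantization data type None is not implemented.
```

So the intermittent failure is a symptom of the wrapper sometimes constructing the 4-bit parameter without a concrete quant type, rather than a problem with the model files themselves.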

yolain commented 3 weeks ago

There is no need to install ComfyUI_bitsandbytes_NF4 separately; it has already been integrated. NF4-quantized models (v1, v2, and the others) can all use the topmost workflow, which does run. GGUF-quantized models need the unet, clip, and vae loaded separately and fed into the override inputs. For the other all-in-one checkpoints I still need to push a fix on my side; fluxLoader previously forced nf4.

terliu commented 3 weeks ago

Strangely, your two examples, EasyUse_flux_fp8_t2i.json and EasyUse_flux_bnb_nf4_t2i.json, only run intermittently: sometimes they work, but most of the time they fail with `NotImplementedError: 4-bit quantization data type None is not implemented.` Also, can ComfyUI_bitsandbytes_NF4 be deleted now? Is its dependency "bitsandbytes>=0.43.0" no longer needed either?

terliu commented 3 weeks ago

This time it runs again (screenshot attached)! I tested it repeatedly at noon and it kept failing.

terliu commented 3 weeks ago

Switching to the V2 model produces the error below (screenshot attached). During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 316, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 191, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 168, in _map_node_over_list
    process_inputs(input_dict, i)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\execution.py", line 157, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5392, in simple
    return super().run(pipe, None, None, None, None, None, image_output, link_id, save_prefix,
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5361, in run
    return process_sample_state(pipe, samp_model, samp_clip, samp_samples, samp_vae, samp_seed, samp_positive, samp_negative, steps, start_step, last_step, cfg, sampler_name, scheduler, denoise, image_output, link_id, save_prefix, tile_size, prompt, extra_pnginfo, my_unique_id, preview_latent, force_full_denoise, disable_noise, samp_custom)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5131, in process_sample_state
    samp_samples, _ = sampler.custom_advanced_ksampler(noise, guider, _sampler, sigmas, samp_samples)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\libs\sampler.py", line 182, in custom_advanced_ksampler
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask,
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 706, in sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\sampler_helpers.py", line 66, in prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required, minimum_memory_required=minimum_memory_required)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 527, in load_models_gpu
    cur_loaded_model = loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 323, in model_load
    self.model.unpatch_model(self.model.offload_device)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\comfy\model_patcher.py", line 634, in unpatch_model
    self.model.to(device_to)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1160, in to
    return self._apply(convert)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 810, in _apply
    module._apply(fn)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 810, in _apply
    module._apply(fn)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 833, in _apply
    param_applied = fn(param)
  File "G:\comfyui\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1158, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "G:\comfyui\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\bitsandbytes_NF4\__init__.py", line 71, in to
    quant_storage=self.quant_storage,
AttributeError: 'ForgeParams4bit' object has no attribute 'quant_storage'. Did you mean: 'quant_state'?
```
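This second failure is an attribute mismatch: the `ForgeParams4bit` object being unpatched was created without a `quant_storage` attribute (newer bitsandbytes releases added it), so reading `self.quant_storage` raises. One defensive pattern for this class of error is a `getattr` fallback. A hypothetical sketch with a stand-in class (`Params4bitLike` and its default are illustrative, not the plugin's actual code):

```python
# Hypothetical sketch: tolerate parameter objects created before the
# `quant_storage` attribute existed by falling back to a default value.
class Params4bitLike:
    def __init__(self, data, quant_state=None):
        self.data = data
        self.quant_state = quant_state  # note: no quant_storage attribute set

    def to_kwargs(self):
        # getattr with a default avoids the AttributeError seen above
        return {
            "quant_state": self.quant_state,
            "quant_storage": getattr(self, "quant_storage", "uint8"),
        }


p = Params4bitLike(data=[0.5])
print(p.to_kwargs()["quant_storage"])  # uint8
```

In practice, aligning the bitsandbytes version with what the wrapper expects (as discussed below in this thread) is the cleaner fix than patching around the attribute.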

yolain commented 3 weeks ago

> Strangely, your two examples, EasyUse_flux_fp8_t2i.json and EasyUse_flux_bnb_nf4_t2i.json, only run intermittently: sometimes they work, but most of the time they fail with `NotImplementedError: 4-bit quantization data type None is not implemented.` Also, can ComfyUI_bitsandbytes_NF4 be deleted now? Is its dependency "bitsandbytes>=0.43.0" no longer needed either?

Before you run an NF4 model, if bitsandbytes is missing I install the dependency first, so that package is no longer needed. The requirement I ship is >=0.43.3; check your installed version. I recall that versions below that produced this same error before.
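A quick way to verify the installed bitsandbytes meets the >=0.43.3 requirement is a numeric comparison of the version string. A minimal sketch (release versions only, no pre-release handling; in a live environment `importlib.metadata.version("bitsandbytes")` would supply the installed string):

```python
def version_at_least(installed: str, required: str) -> bool:
    """Compare dotted release versions numerically, e.g. "0.43.3"."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(installed) >= parse(required)


# In a real check you would obtain the installed string with:
#   from importlib.metadata import version
#   installed = version("bitsandbytes")
print(version_at_least("0.43.0", "0.43.3"))  # False -> upgrade needed
print(version_at_least("0.43.3", "0.43.3"))  # True
```

String comparison alone would be wrong here (e.g. "0.43.10" < "0.43.3" lexicographically), which is why the parts are compared as integers.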

terliu commented 3 weeks ago

Confirmed: upgrading to 0.43.3 solved the problem. Thanks!