amir84ferdos opened this issue 23 hours ago
What's your workflow?
Exception Message: 'NoneType' object has no attribute 'model_size'
File "/home/user/ComfyUI/execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "/home/user/ComfyUI/execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/comfy_extras/nodes_flux.py", line 21, in encode
return (clip.encode_from_tokens_scheduled(tokens, add_dict={"guidance": guidance}), )
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/comfy/sd.py", line 143, in encode_from_tokens_scheduled
pooled_dict = self.encode_from_tokens(tokens, return_pooled=return_pooled, return_dict=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/comfy/sd.py", line 204, in encode_from_tokens
self.load_model()
File "/home/user/ComfyUI/comfy/sd.py", line 237, in load_model
model_management.load_model_gpu(self.patcher)
File "/home/user/ComfyUI/comfy/model_management.py", line 517, in load_model_gpu
return load_models_gpu([model])
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/comfy/model_management.py", line 485, in load_models_gpu
free_memory(total_memory_required[device] * 1.1 + extra_mem, device)
File "/home/user/ComfyUI/comfy/model_management.py", line 413, in free_memory
can_unload.append((-shift_model.model_offloaded_memory(), sys.getrefcount(shift_model.model), shift_model.model_memory(), i))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/ComfyUI/comfy/model_management.py", line 318, in model_offloaded_memory
return self.model.model_size() - self.model.loaded_size()
^^^^^^^^^^^^^^^^^^^^^
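For readers trying to make sense of the failing frame: `free_memory` walks the list of loaded models and calls `model_offloaded_memory()`, which dereferences `self.model`; the error means that reference has already gone away, which is consistent with the "memory leak" warnings in the log below. Here is a minimal, self-contained mock of that failure mode with a guard that would avoid the AttributeError; the class names only mirror the traceback and are not ComfyUI code, and this is an illustration rather than the upstream fix.

```python
import weakref

class FakeModel:
    """Stand-in for a loaded model; only the two size hooks the traceback uses."""
    def model_size(self):
        return 1000
    def loaded_size(self):
        return 400

class LoadedModelMock:
    """Mirrors the shape of the failing LoadedModel entry, not ComfyUI's class."""
    def __init__(self, model):
        self._model = weakref.ref(model)  # only a weak reference is kept

    @property
    def model(self):
        return self._model()  # becomes None once the model is collected

    def model_offloaded_memory(self):
        m = self.model
        if m is None:
            # guard that would turn the AttributeError into a harmless 0
            return 0
        return m.model_size() - m.loaded_size()

model = FakeModel()
entry = LoadedModelMock(model)
print(entry.model_offloaded_memory())  # 600 while the model is alive
del model                               # simulate the model being freed elsewhere
print(entry.model_offloaded_memory())  # 0 instead of crashing
```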
## System Information
- **ComfyUI Version:** v0.3.6-3-g0ee322e
- **Arguments:** /home/user/ComfyUI/main.py --listen 0.0.0.0
- **OS:** posix
- **Python Version:** 3.12.7 (main, Oct 1 2024, 11:15:50) [GCC 14.2.1 20240910]
- **Embedded Python:** false
- **PyTorch Version:** 2.5.1+cu124
## Devices
- **Name:** cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
- **Type:** cuda
- **VRAM Total:** 25323503616
- **VRAM Free:** 5310267890
- **Torch VRAM Total:** 19696451584
- **Torch VRAM Free:** 72565234
## Logs
...
2024-12-02T18:06:10.290841 - Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
2024-12-02T18:06:10.522830 - WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
2024-12-02T18:06:10.524050 - Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
2024-12-02T18:06:10.755786 - WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
2024-12-02T18:06:10.755933 - Requested to load FluxClipModel_
2024-12-02T18:06:10.756020 - Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
2024-12-02T18:06:10.988477 - WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
2024-12-02T18:06:10.988676 - !!! Exception during processing !!! 'NoneType' object has no attribute 'model_size'
2024-12-02T18:06:10.989697 - Traceback (most recent call last):
  File "/home/user/ComfyUI/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "/home/user/ComfyUI/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "/home/user/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/home/user/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "/home/user/ComfyUI/comfy_extras/nodes_flux.py", line 21, in encode
    return (clip.encode_from_tokens_scheduled(tokens, add_dict={"guidance": guidance}), )
  File "/home/user/ComfyUI/comfy/sd.py", line 143, in encode_from_tokens_scheduled
    pooled_dict = self.encode_from_tokens(tokens, return_pooled=return_pooled, return_dict=True)
  File "/home/user/ComfyUI/comfy/sd.py", line 204, in encode_from_tokens
    self.load_model()
  File "/home/user/ComfyUI/comfy/sd.py", line 237, in load_model
    model_management.load_model_gpu(self.patcher)
  File "/home/user/ComfyUI/comfy/model_management.py", line 517, in load_model_gpu
    return load_models_gpu([model])
  File "/home/user/ComfyUI/comfy/model_management.py", line 485, in load_models_gpu
    free_memory(total_memory_required[device] * 1.1 + extra_mem, device)
  File "/home/user/ComfyUI/comfy/model_management.py", line 413, in free_memory
    can_unload.append((-shift_model.model_offloaded_memory(), sys.getrefcount(shift_model.model), shift_model.model_memory(), i))
  File "/home/user/ComfyUI/comfy/model_management.py", line 318, in model_offloaded_memory
    return self.model.model_size() - self.model.loaded_size()
AttributeError: 'NoneType' object has no attribute 'model_size'
2024-12-02T18:06:10.990346 - Prompt executed in 0.70 seconds
## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
{"last_node_id":151,"last_link_id":1392,"nodes":[{"id":126,"type":"PreviewImage","pos":[-80,-1365],"size":[292.6629638671875,456.1589660644531],"flags":{"collapsed":false},"order":16,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":1379}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[],"color":"#222","bgcolor":"#000","shape":1},{"id":127,"type":"SaveImage","pos":[230,-1360],"size":[398.1168212890625,633.26171875],"flags":{"collapsed":false},"order":17,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":1380}],"outputs":[],"title":"Save Flux Image","properties":{"Node name for S&R":"SaveImage"},"widgets_values":["FLUX"],"color":"#222","bgcolor":"#000","shape":1},{"id":130,"type":"Seed Everywhere","pos":[-555,-1545],"size":[340,90],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","links":[],"slot_index":0}],"properties":{"Node name for S&R":"Seed Everywhere","group_restricted":0,"color_restricted":0},"widgets_values":[106265685439735,"fixed"]},{"id":135,"type":"VAELoader","pos":[650,355],"size":[315,58],"flags":{"collapsed":true},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"VAE","type":"VAE","links":[1367,1371],"slot_index":0}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["FLUX1/ae.safetensors"]},{"id":142,"type":"FluxResolutionNode","pos":[-650,365],"size":[315,170],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"width","type":"INT","links":[1357]},{"name":"height","type":"INT","links":[1358],"slot_index":1},{"name":"resolution","type":"STRING","links":null}],"properties":{"Node name for S&R":"FluxResolutionNode"},"widgets_values":["1.0","3:5 (Elegant Vertical)",false,"1:1"]},{"id":143,"type":"EmptyLatentImage","pos":[-295,400],"size":[315,106],"flags":{"collapsed":true},"order":8,"mode":0,"inputs":[{"name":"width","type":"INT","link":1357,"widget":{"name":"width"}},{"name":"height","type":"INT","link":1358,"widget":{"name":"height"}}],"outputs":[{"name":"LATENT","type":"LATENT","links":[1365],"slot_index":0}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[512,512,1]},{"id":144,"type":"BasicGuider","pos":[-70,350],"size":[241.79998779296875,46],"flags":{"collapsed":false},"order":11,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":1390,"slot_index":0},{"name":"conditioning","type":"CONDITIONING","link":1359,"slot_index":1}],"outputs":[{"name":"GUIDER","type":"GUIDER","links":[1362],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"BasicGuider"},"widgets_values":[]},{"id":145,"type":"ConditioningZeroOut","pos":[1290,490],"size":[317.4000244140625,26],"flags":{"collapsed":true},"order":12,"mode":0,"inputs":[{"name":"conditioning","type":"CONDITIONING","link":1360}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[1370],"slot_index":0}],"properties":{"Node name for S&R":"ConditioningZeroOut"},"widgets_values":[]},{"id":146,"type":"SamplerCustomAdvanced","pos":[215,280],"size":[355.20001220703125,326],"flags":{},"order":13,"mode":0,"inputs":[{"name":"noise","type":"NOISE","link":1361},{"name":"guider","type":"GUIDER","link":1362},{"name":"sampler","type":"SAMPLER","link":1363},{"name":"sigmas","type":"SIGMAS","link":1364},{"name":"latent_image","type":"LATENT","link":1365}],"outputs":[{"name":"output","type":"LATENT","links":null},{"name":"denoised_output","type":"LATENT","links":[1366],"slot_index":1}],"properties":{"Node name for 
S&R":"SamplerCustomAdvanced"},"widgets_values":[]},{"id":147,"type":"VAEDecode","pos":[815,305],"size":[210,46],"flags":{},"order":14,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":1366},{"name":"vae","type":"VAE","link":1367}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[1368,1379],"slot_index":0}],"title":"Flux Image Out","properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":148,"type":"UltimateSDUpscale","pos":[1495,400],"size":[315,826],"flags":{"collapsed":false},"order":15,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":1368},{"name":"model","type":"MODEL","link":1391},{"name":"positive","type":"CONDITIONING","link":1369},{"name":"negative","type":"CONDITIONING","link":1370},{"name":"vae","type":"VAE","link":1371},{"name":"upscale_model","type":"UPSCALE_MODEL","link":1372},{"name":"seed","type":"INT","link":null,"widget":{"name":"seed"}}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[1380],"slot_index":0,"shape":3}],"title":"Flux Upscale","properties":{"Node name for S&R":"UltimateSDUpscale"},"widgets_values":[2,567933527686538,"randomize",8,1,"deis","normal",0.28,"Linear",1024,1024,16,32,"None",1,64,8,16,true,false]},{"id":136,"type":"RandomNoise","pos":[-125,195],"size":[315,82],"flags":{"collapsed":false},"order":3,"mode":0,"inputs":[{"name":"noise_seed","type":"INT","link":null,"widget":{"name":"noise_seed"}}],"outputs":[{"name":"NOISE","type":"NOISE","links":[1361],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"RandomNoise"},"widgets_values":[1024963360558584,"randomize"]},{"id":138,"type":"BasicScheduler","pos":[-220,655],"size":[315,106],"flags":{},"order":9,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":1389,"slot_index":0}],"outputs":[{"name":"SIGMAS","type":"SIGMAS","links":[1364],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"BasicScheduler"},"widgets_values":["beta",40,1],"color":"#322","bgcolor":"#533"},{"id":137,"type":"KSamplerSelect","pos":[-220,530],"size":[315,58],"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[{"name":"SAMPLER","type":"SAMPLER","links":[1363],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"KSamplerSelect"},"widgets_values":["dpmpp_2m"]},{"id":149,"type":"UnetLoaderGGUF","pos":[-2185,125],"size":[315,58],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[1389,1390,1391],"slot_index":0}],"properties":{"Node name for S&R":"UnetLoaderGGUF"},"widgets_values":["FLUX1/flux1-dev-Q8_0.gguf"]},{"id":141,"type":"UpscaleModelLoader","pos":[1145,500],"size":[315,58],"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[{"name":"UPSCALE_MODEL","type":"UPSCALE_MODEL","links":[1372],"shape":3}],"title":"Flux Upscale Model","properties":{"Node name for S&R":"UpscaleModelLoader"},"widgets_values":["4x_NMKD-Siax_200k.pth"]},{"id":151,"type":"DualCLIPLoader","pos":[-2160,-90],"size":[315,106],"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[1392],"slot_index":0}],"properties":{"Node name for S&R":"DualCLIPLoader"},"widgets_values":["t5/google_t5-v1_1-xxl_encoderonly-fp16.safetensors","clip_l.safetensors","flux"]},{"id":134,"type":"CLIPTextEncodeFlux","pos":[-1565,-400],"size":[991.6920776367188,442.3399353027344],"flags":{"collapsed":false},"order":10,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":1392,"slot_index":0}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[1359,1360,1369],"slot_index":0,"shape":3}],"properties":{"Node name for 
S&R":"CLIPTextEncodeFlux"},"widgets_values":["","A person.",3.5],"color":"#232","bgcolor":"#353"}],"links":[[1357,142,0,143,0,"INT"],[1358,142,1,143,1,"INT"],[1359,134,0,144,1,"CONDITIONING"],[1360,134,0,145,0,"CONDITIONING"],[1361,136,0,146,0,"NOISE"],[1362,144,0,146,1,"GUIDER"],[1363,137,0,146,2,"SAMPLER"],[1364,138,0,146,3,"SIGMAS"],[1365,143,0,146,4,"LATENT"],[1366,146,1,147,0,"LATENT"],[1367,135,0,147,1,"VAE"],[1368,147,0,148,0,"IMAGE"],[1369,134,0,148,2,"CONDITIONING"],[1370,145,0,148,3,"CONDITIONING"],[1371,135,0,148,4,"VAE"],[1372,141,0,148,5,"UPSCALE_MODEL"],[1379,147,0,126,0,"IMAGE"],[1380,148,0,127,0,"IMAGE"],[1387,130,0,148,6,"INT"],[1388,130,0,136,0,"INT"],[1389,149,0,138,0,"MODEL"],[1390,149,0,144,0,"MODEL"],[1391,149,0,148,1,"MODEL"],[1392,151,0,134,0,"CLIP"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.5730855330116875,"offset":[3123.359892116617,1400.0149698697119]},"ue_links":[{"downstream":148,"downstream_slot":6,"upstream":"130","upstream_slot":0,"controller":130,"type":"INT"},{"downstream":136,"downstream_slot":0,"upstream":"130","upstream_slot":0,"controller":130,"type":"INT"}],"groupNodes":{}},"version":0.4}
## Additional Context
This only occurs on subsequent runs; the first run works as expected.
I got the same problem today after updating via "Update all" in the Manager. As noted above, the first run goes fine, but if I then change the generation parameters, this error appears on subsequent runs.
Does it happen if you don't use the gguf loader?
no, only with GGUF loader
+1, and I get this in KSampler:
!!! Exception during processing !!! ControlNet.get_control() missing 1 required positional argument: 'transformer_options'
Traceback (most recent call last):
  File "X:\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "X:\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "X:\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "X:\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\a1111_compat.py", line 195, in doit
    return (self.sample(*args, **kwargs)[0],)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\a1111_compat.py", line 188, in sample
    return inspire_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\a1111_compat.py", line 96, in inspire_ksampler
    raise e
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\a1111_compat.py", line 84, in inspire_ksampler
    samples = common.impact_sampling(
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\libs\common.py", line 14, in impact_sampling
    return nodes.NODE_CLASS_MAPPINGS['RegionalSampler'].separated_sample(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\special_samplers.py", line 312, in separated_sample
    return separated_sample(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 216, in separated_sample
    res = sample_with_custom_noise(model, add_noise, seed, cfg, positive, negative, impact_sampler, sigmas, latent_image, noise=noise, callback=callback)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 160, in sample_with_custom_noise
    samples = comfy.sample.sample_custom(model, noise, cfg, sampler, sigmas, positive, negative, latent_image,
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 48, in sample_custom
    samples = comfy.samplers.sample(model, noise, positive, negative, cfg, model.load_device, sampler, sigmas, model_options=model.model_options, latent_image=latent_image, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 135, in sample
    return orig_fn(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 918, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 904, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 873, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 857, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 100, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion\utils.py", line 34, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 714, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "X:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 384, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 839, in __call__
    return self.predict_noise(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 842, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 175, in sampling_function
    out = orig_fn(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 364, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 200, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 311, in _calc_cond_batch
    output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
  File "X:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "X:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion\tiled_diffusion.py", line 798, in __call__
    c_tile['control'] = c_in['control'].get_control_orig(x_tile, t_tile, c_tile, len(cond_or_uncond))
TypeError: ControlNet.get_control() missing 1 required positional argument: 'transformer_options'
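The TypeError above is a different symptom of the same kind of update breakage: a core method (`ControlNet.get_control`) now requires a `transformer_options` parameter, while the ComfyUI-TiledDiffusion wrapper still calls it with the old four arguments. A self-contained sketch of the mismatch follows; the class and argument names other than `transformer_options` are illustrative, not ComfyUI's actual API.

```python
class ControlNetNew:
    """New-style signature: transformer_options is a required argument."""
    def get_control(self, x_noisy, t, cond, batched_number, transformer_options):
        return ("control", batched_number, transformer_options)

class ControlNetCompat:
    """Hypothetical backwards-compatible variant: the new argument has a default."""
    def get_control(self, x_noisy, t, cond, batched_number, transformer_options=None):
        return ("control", batched_number, transformer_options or {})

def old_style_caller(control):
    # Mimics the custom-node call site: only four arguments are passed.
    return control.get_control(None, None, None, 2)

print(old_style_caller(ControlNetCompat()))  # works: ('control', 2, {})
try:
    old_style_caller(ControlNetNew())
except TypeError as e:
    print(e)  # missing 1 required positional argument: 'transformer_options'
```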
It happens for me with the UNet loader and Flux UNet models. No problem with the NF4 checkpoint loader.
The same error happens to me using SwarmUI, and it is easy to reproduce: on the first load of the GUI, select any LoRA and generate one image; everything is OK. Then unselect the LoRA and try to generate again. It gives the error "'NoneType' object has no attribute 'model_size'", and the only way to get rid of it is to completely restart the ComfyUI server. Until yesterday everything was working perfectly. I use a GGUF Q8 model; I don't know if it happens with others.
Same in ComfyUI when changing LoRA weights.
Each time, this is accompanied by the message "Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code" in the terminal, and a reference to:
...\ComfyUI\comfy\model_management.py", line 318, in model_offloaded_memory
    return self.model.model_size() - self.model.loaded_size()
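For anyone who wants to confirm this before the crash, here is a small diagnostic sketch that lists which loaded-model entries have already lost their model reference. Everything in it (`current_loaded_models`, `.model`, `model_memory()`) is inferred from the traceback and the warnings above, so treat the names as assumptions about this ComfyUI version rather than a stable API; it has to run inside the ComfyUI process (for example from a throwaway custom node or a debugger).

```python
import sys
import comfy.model_management as mm  # assumes we are inside a ComfyUI process

def dump_loaded_models():
    """Print each tracked LoadedModel and flag entries whose model is already gone."""
    for i, lm in enumerate(getattr(mm, "current_loaded_models", [])):
        model = lm.model
        if model is None:
            print(f"[{i}] stale entry: model reference already collected")
            continue
        print(f"[{i}] {type(model).__name__}: "
              f"refcount={sys.getrefcount(model)}, "
              f"model_memory={lm.model_memory()}")

dump_loaded_models()
```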
GGUF Workflow
got prompt
Potential memory leak detected with model FluxClipModel_, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Potential memory leak detected with model FluxClipModel_, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Requested to load Flux
Potential memory leak detected with model FluxClipModel_, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
loaded partially 2630.992 2629.947509765625 0
0%| | 0/15 [01:50<?, ?it/s]
Processing interrupted
Prompt executed in 131.68 seconds
got prompt
Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Requested to load FluxClipModel_
loaded completely 9.5367431640625e+25 5062.70263671875 True
Attempting to release mmap (319)
Processing interrupted
Prompt executed in 514.77 seconds
got prompt
Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
Requested to load Flux
Potential memory leak detected with model Flux, doing a full garbage collect, for maximum performance avoid circular references in the model code.
WARNING, memory leak with model Flux. Please make sure it is not being referenced from somewhere.
WARNING, memory leak with model FluxClipModel_. Please make sure it is not being referenced from somewhere.
loaded partially 3604.368505859375 3603.658203125 0
Attempting to release mmap (236)
[ComfyUI-Manager] Restore snapshot to 2024-11-29_11-39-36_autosave
If I restore the snapshot to 2024-11-29_11-39-36_autosave, everything works fine and I don't get those messages. I don't think it's the GGUF nodes, since restoring doesn't change them.
## Expected Behavior
It was working fine before the update.
## Actual Behavior
After updating, it is broken.
## Steps to Reproduce
'NoneType' object has no attribute 'model_size'
## Debug Logs
## Other
No response