ssitu / ComfyUI_fabric

ComfyUI nodes based on the paper "FABRIC: Personalizing Diffusion Models with Iterative Feedback" (Feedback via Attention-Based Reference Image Conditioning)
GNU General Public License v3.0

SDXL Error #6

Closed: za-wa-n-go closed this issue 10 months ago

za-wa-n-go commented 1 year ago

```
!!! Exception during processing !!!
Traceback (most recent call last):
  File "C:\Product\ComfyUI\execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "C:\Product\ComfyUI\execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "C:\Product\ComfyUI\execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 181, in sample
    return KSamplerFABRICAdv().sample(*args, **kwargs)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 138, in sample
    return fabric_sample(*args, **kwargs)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 52, in fabric_sample
    samples = KSamplerAdvanced().sample(model_patched, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive,
  File "C:\Product\ComfyUI\nodes.py", line 1270, in sample
    return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
  File "C:\Product\ComfyUI\nodes.py", line 1206, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "C:\Product\ComfyUI\comfy\sample.py", line 97, in sample
    samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 120, in KSampler_sample
    return _KSampler_sample(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 785, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler(), sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 128, in sample
    return _sample(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 690, in sample
    samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 630, in sample
    samples = getattr(k_diffusion_sampling, "sample_{}".format(sampler_name))(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar)
  File "C:\Product\ComfyUI\venv\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\k_diffusion\sampling.py", line 580, in sample_dpmpp_2m
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "C:\Product\ComfyUI\venv\Lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 323, in forward
    out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, cond_concat=cond_concat, model_options=model_options, seed=seed)
  File "C:\Product\ComfyUI\venv\Lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\k_diffusion\external.py", line 125, in forward
    eps = self.get_eps(input * c_in, self.sigma_to_t(sigma), **kwargs)
  File "C:\Product\ComfyUI\comfy\k_diffusion\external.py", line 151, in get_eps
    return self.inner_model.apply_model(*args, **kwargs)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 849, in apply_model
    out = super().apply_model(x, timestep, cc, uu, cond_scale, cond_concat, model_options, seed)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 311, in apply_model
    out = sampling_function(self.inner_model.apply_model, x, timestep, uncond, cond, cond_scale, cond_concat, model_options=model_options, seed=seed)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 289, in sampling_function
    cond, uncond = calc_cond_uncond_batch(model_function, cond, uncond, x, timestep, max_total_area, cond_concat, model_options)
  File "C:\Product\ComfyUI\comfy\samplers.py", line 263, in calc_cond_uncond_batch
    output = model_options['model_function_wrapper'](model_function, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 282, in unet_wrapper
    out = model_func(input, ts, **c)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_utils.py", line 17, in <lambda>
    setattr(resolved_obj, func_path[-1], lambda *args, **kwargs: self(*args, **kwargs))
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_utils.py", line 28, in __call__
    return self.orig_func(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\model_base.py", line 63, in apply_model
    return self.diffusion_model(xc, t, context=context, y=c_adm, control=control, transformer_options=transformer_options).float()
  File "C:\Product\ComfyUI\venv\Lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Product\ComfyUI\custom_nodes\SeargeSDXL\modules\custom_sdxl_ksampler.py", line 70, in new_unet_forward
    x0 = old_unet_forward(self, x, timesteps, context, y, control, transformer_options, **kwargs)
  File "C:\Product\ComfyUI\custom_nodes\FreeU_Advanced\nodes.py", line 173, in tempforward
    h = forward_timestep_embed(module, h, emb, context, transformer_options)
  File "C:\Product\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 56, in forward_timestep_embed
    x = layer(x, context, transformer_options)
  File "C:\Product\ComfyUI\venv\Lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\ldm\modules\attention.py", line 695, in forward
    x = block(x, context=context[i], transformer_options=transformer_options)
  File "C:\Product\ComfyUI\venv\Lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Product\ComfyUI\comfy\ldm\modules\attention.py", line 525, in forward
    return checkpoint(self._forward, (x, context, transformer_options), self.parameters(), self.checkpoint)
  File "C:\Product\ComfyUI\comfy\ldm\modules\diffusionmodules\util.py", line 123, in checkpoint
    return func(*inputs)
  File "C:\Product\ComfyUI\comfy\ldm\modules\attention.py", line 569, in _forward
    n, context_attn1, value_attn1 = p(n, context_attn1, value_attn1, extra_options)
  File "C:\Product\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 140, in modified_attn1
    assert neg_hs.shape[0] == num_neg, f"neg_hs batch size ({neg_hs.shape[0]}) != number of neg_latents ({num_neg})"
AssertionError: neg_hs batch size (3) != number of neg_latents (1)
```

Prompt executed in 45.51 seconds

Baughn commented 1 year ago

I spent a bit of time debugging the code. Not that it got me anywhere, but here are some ideas, I suppose...

The batch (of conditioning?) in unet_wrapper consistently seems to be inflated by 2x. For a 1-picture batch I get a batch size of 2, for a 4-picture batch it's 8, and so on. That's presumably the cause of the errors, though it's not as simple as dividing by 2 there, even as a hack; c['c_adm'] also comes in with a shape of [8, ...] and so on, and I don't know how to track down the cause (there's a toy sketch of what I mean at the end of this section). But:

SDXL, notably, uses two text encoders. You would expect it to use twice as much conditioning as SD 1.5/2, since it has both CLIP_G and CLIP_L.

Though in this case it breaks further in than that.

I don't know if any of the above makes sense, but something to look into?
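To make the 2x point concrete, here's a toy, standalone sketch of how I understand ComfyUI's CFG batching. It is not the real calc_cond_uncond_batch; all names and shapes are made up for illustration. The idea is just that the cond and uncond passes over the same latents get stacked into a single forward call, so a model_function_wrapper like unet_wrapper sees twice the number of latent images:

```python
import torch

def calc_cond_uncond_batch_sketch(model_func, cond, uncond, x):
    """Toy illustration of CFG batching (hypothetical names, not ComfyUI's code).

    The positive and negative passes over the same latents are stacked into a
    single forward call, so any wrapper around model_func sees a batch that is
    2x the number of latent images.
    """
    input_x = torch.cat([x, x], dim=0)          # latents duplicated: cond half + uncond half
    context = torch.cat([cond, uncond], dim=0)  # text conditioning stacked the same way
    return model_func(input_x, context)

def unet_wrapper_sketch(input_x, context):
    # A wrapper that assumed input_x.shape[0] == number of pictures in the
    # batch would be off by a factor of 2 here.
    print("wrapper sees batch size:", input_x.shape[0])
    return torch.zeros_like(input_x)

latents = torch.randn(1, 4, 64, 64)   # a 1-picture batch
cond    = torch.randn(1, 77, 768)
uncond  = torch.randn(1, 77, 768)
calc_cond_uncond_batch_sketch(unet_wrapper_sketch, cond, uncond, latents)  # prints 2
```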

= = =

Also (likely unrelated): it reliably fails, in a different manner, if the batch size in ComfyUI != the total number of latents, as can happen if you use Rebatch Latents. (Which is generally useful, since it means I can generate 24 pictures at once even though my GPU can only handle 10.)

(I wonder if that's what happened in the stack trace above? It looks familiar.)
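For what it's worth, here is a minimal repro-in-spirit of that rebatch mismatch (hypothetical names, not FABRIC's actual bookkeeping): if a node records the total latent count once up front, but Rebatch Latents then feeds the sampler GPU-sized chunks, a shape assertion in the style of the one in the traceback trips on the first chunk.

```python
import torch

total = torch.randn(24, 4, 64, 64)        # 24 latents queued in one go
num_latents_recorded = total.shape[0]     # count captured up front (hypothetical bookkeeping)

# Rebatch Latents-style split into batches the GPU can handle
for chunk in torch.split(total, 10, dim=0):
    batch = chunk.shape[0]                # 10, 10, 4
    # A check in this style fails on the very first chunk (10 != 24):
    assert batch == num_latents_recorded, \
        f"batch size ({batch}) != number of latents ({num_latents_recorded})"
```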

= = =

If you lack a GPU capable of running SDXL to test on, I'd be happy to lend you one!

ssitu commented 1 year ago

Yeah, right now I am pulling the CLIP embeddings out of the unprocessed conditioning and putting them into a dict. Instead, I'll have to pass the conditioning through a KSampler and pull the processed conditioning out. Still trying to figure that out.
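For context, here's roughly what that conditioning looks like as I understand the ComfyUI side (shapes below are illustrative, and raw_clip_embeddings is just a made-up helper to show the current approach): each conditioning entry is a [cond_tensor, options] pair, and for SDXL the options also carry a pooled_output that the sampler later turns into the model's extra c_adm/y input, which grabbing only the raw CLIP embeddings misses.

```python
import torch

# SD 1.5-style conditioning: one CLIP embedding per entry, nothing extra needed.
sd15_cond = [
    [torch.randn(1, 77, 768), {}],
]

# SDXL-style conditioning: concatenated CLIP-L + CLIP-G embeddings (2048 channels)
# plus a pooled output that the sampler later processes into the model's c_adm / y input.
sdxl_cond = [
    [torch.randn(1, 77, 2048), {"pooled_output": torch.randn(1, 1280)}],
]

def raw_clip_embeddings(conditioning):
    # Grabbing only the raw embeddings (the current approach) drops the extra
    # SDXL fields, which is why the conditioning needs to go through the
    # sampler's own processing instead.
    return [entry[0] for entry in conditioning]
```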

I believe the error above wasn't from Rebatch Latents since @za-wa-n-go found it happening in this workflow: https://github.com/ssitu/ComfyUI_fabric/issues/4#issuecomment-1732701454

I still can't trigger that assertion error on my end.

Baughn commented 1 year ago

You mean SDXL works for you? Or Za-wa-n's error specifically?

ssitu commented 1 year ago

I mean that @za-wa-n-go was getting that error with a 1.5 model with a workflow that didn't use Rebatch Latents, but I can't reproduce it.

ssitu commented 1 year ago

I might have fixed this with e5e8897539853a77d20e5a364010f3d73e5d6f6d, anyone mind trying to see if it works with SDXL now?

revolvedai commented 11 months ago

It works with SDXL! It still seems to have memory-handling issues, however. For example, on the first run I can use as many latents as I want across multiple FABRIC pickers, but after that it runs out of memory on the next run from the top.

ssitu commented 10 months ago

> It works with SDXL! It still seems to have memory-handling issues, however. For example, on the first run I can use as many latents as I want across multiple FABRIC pickers, but after that it runs out of memory on the next run from the top.

I'd have to look at the workflow to get some more information.

ssitu commented 10 months ago

Since SDXL works, I'll close the issue.