ssitu / ComfyUI_fabric

ComfyUI nodes based on the paper "FABRIC: Personalizing Diffusion Models with Iterative Feedback" (Feedback via Attention-Based Reference Image Conditioning)
GNU General Public License v3.0

Error idx = extra_options['transformer_index'] #21

Closed: manzonif closed this issue 8 months ago

manzonif commented 9 months ago

Error occurred when executing KSamplerFABRIC:

'transformer_index'

File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 181, in sample return KSamplerFABRICAdv().sample(*args, *kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 138, in sample return fabric_sample(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 52, in fabric_sample samples = KSamplerAdvanced().sample(model_patched, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive, File "D:\ai\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1333, in sample return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise) File "D:\ai\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1269, in common_ksampler samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample return original_sample(*args, kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations. 
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 178, in animatediff_sample return orig_comfy_sample(model, noise, *args, *kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 100, in sample samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes__init__.py", line 129, in KSampler_sample return _KSampler_sample(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 711, in sample return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes__init.py", line 138, in sample return _sample(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 617, in sample samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 556, in sample samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, self.extra_options) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context return func(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 580, in sample_dpmpp_2m denoised = model(x, sigmas[i] * s_in, *extra_args) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl return self._call_impl(args, kwargs) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl return forward_call(*args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 277, in forward out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl return self._call_impl(*args, *kwargs) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl return forward_call(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 267, in forward return self.apply_model(*args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 900, in apply_model out = super().apply_model(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 264, in apply_model out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 252, in sampling_function cond, uncond = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 228, in calc_cond_uncond_batch output = 
model_options['model_function_wrapper'](model.apply_model, {"input": inputx, "timestep": timestep, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 202, in unet_wrapper return model_func(input, ts, c) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_utils.py", line 17, in setattr(resolved_obj, func_path[-1], lambda args, kwargs: self(*args, **kwargs)) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_utils.py", line 28, in call return self.orig_func(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 73, in apply_model model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, extra_conds).float() File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl return self._call_impl(args, kwargs) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl return forward_call(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\SeargeSDXL\modules\custom_sdxl_ksampler.py", line 70, in new_unet_forward x0 = old_unet_forward(self, x, timesteps, context, y, control, transformer_options, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\FreeU_Advanced\nodes.py", line 173, in tempforward h = forward_timestep_embed(module, h, emb, context, transformer_options) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 46, in forward_timestep_embed x = layer(x, context, transformer_options) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl return self._call_impl(args, kwargs) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl return forward_call(*args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 600, in forward x = block(x, context=context[i], transformer_options=transformer_options) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl return self._call_impl(*args, *kwargs) File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl return forward_call(args, kwargs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 427, in forward return checkpoint(self._forward, (x, context, transformer_options), self.parameters(), self.checkpoint) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\diffusionmodules\util.py", line 190, in checkpoint return func(*inputs) File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 466, in _forward n, context_attn1, value_attn1 = p(n, context_attn1, value_attn1, extra_options) File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 113, in store_hidden_states idx = extra_options['transformer_index']

I don't know what extra_options['transformer_index'] is, but the error could be related to this:

#This is needed because accelerate makes a copy of transformer_options which breaks "transformer_index"
def forward_timestep_embed(ts, x, emb, context=None, transformer_options={}, output_shape=None, time_context=None, num_video_frames=None, image_only_indicator=None):
    for layer in ts:
        if isinstance(layer, VideoResBlock):
            x = layer(x, emb, num_video_frames, image_only_indicator)
        elif isinstance(layer, TimestepBlock):
            x = layer(x, emb)
        elif isinstance(layer, SpatialVideoTransformer):
            x = layer(x, context, time_context, num_video_frames, image_only_indicator, transformer_options)
            if "transformer_index" in transformer_options:
                transformer_options["transformer_index"] += 1
        elif isinstance(layer, SpatialTransformer):
            x = layer(x, context, transformer_options)
            if "transformer_index" in transformer_options:
                transformer_options["transformer_index"] += 1
        elif isinstance(layer, Upsample):
            x = layer(x, output_shape=output_shape)
        else:
            x = layer(x)
    return x

The above is from ComfyUI; there is similar handling in ComfyUI-AnimateDiff-Evolved, which was updated a few days ago.
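
For illustration, here is a minimal, hypothetical sketch (not the repo's actual code) of the failing pattern: an attn1 patch that indexes its hidden-state cache by extra_options['transformer_index'] raises the KeyError seen above whenever a forward path stops populating that key.

# Hypothetical, simplified sketch of the failing pattern (not ComfyUI_fabric's
# actual code): the patch assumes ComfyUI always puts 'transformer_index' into
# extra_options. After the SVD-related refactor of forward_timestep_embed, the
# key can be missing and the bare lookup raises KeyError.

all_hiddens: dict[int, list] = {}

def store_hidden_states(q, k, v, extra_options):
    idx = extra_options["transformer_index"]  # raises KeyError when the key is absent
    # A tolerant variant would guard the lookup, e.g.:
    #   idx = extra_options.get("transformer_index", 0)
    all_hiddens.setdefault(idx, []).append(q)
    return q, k, v

# extra_options as populated by newer builds may no longer contain the key:
try:
    store_hidden_states("q", "k", "v", extra_options={})
except KeyError as missing:
    print("missing key:", missing)  # -> missing key: 'transformer_index'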

manzonif commented 9 months ago

Just to confirm: the ComfyUI updates from last November 23rd that added SVD support are the cause of the error. I rolled back and the error above no longer occurs. It seems that something like this needs to be added somewhere:

if "current_index" in extra_options:
    extra_options["transformer_index"] = extra_options["current_index"] + 1

However, even with the previous version of ComfyUI I have memory-related problems on a Windows machine. It seems that resources are not released correctly: running the example workflow (fabric_round_by_round.json), by the third round I reach 24 GB of VRAM with SD 1.5, and with SDXL I go out of memory.
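
If cached feedback tensors are being held on the GPU between rounds, the usual PyTorch pattern for releasing them looks roughly like the sketch below (names such as release_round_cache and the cache dict are hypothetical, not part of ComfyUI_fabric):

import gc
import torch

def release_round_cache(cache: dict) -> None:
    # Drop references to the cached tensors so PyTorch can reuse their memory.
    cache.clear()
    gc.collect()
    if torch.cuda.is_available():
        # Return freed blocks from PyTorch's caching allocator to the driver,
        # which is what makes the drop visible in tools like nvidia-smi.
        torch.cuda.empty_cache()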

ssitu commented 9 months ago

Thank you for the suggestions, I’ll get to updating this repo soon. Could you send the error log for when you run out of memory?

manzonif commented 9 months ago

So, I figured out that it goes out of memory if I set the empty latent size to 1024 x 1024. If I leave it at 512 x 512 it also works with SDXL, although it uses almost 24 GB. Could it be that a latent of that size just isn't expected?

Anyway, here's the log:

Error occurred when executing KSamplerFABRIC:

Allocation on device 0 would exceed memory. (out of memory)
Currently allocated     : 21.44 GiB
Requested               : 1.33 GiB
Device limit            : 23.99 GiB
Free (according to CUDA): 0 bytes
PyTorch limit (set by user-supplied memory fraction)
                        : 17179869184.00 GiB

File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 181, in sample
return KSamplerFABRICAdv().sample(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 138, in sample
return fabric_sample(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 52, in fabric_sample
samples = KSamplerAdvanced().sample(model_patched, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive,
File "D:\ai\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1320, in sample
return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1256, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
raise e
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 178, in animatediff_sample
return orig_comfy_sample(model, noise, *args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 129, in KSampler_sample
return _KSampler_sample(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 711, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 138, in sample
return _sample(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 617, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 556, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 580, in sample_dpmpp_2m
denoised = model(x, sigmas[i] * s_in, **extra_args)
File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 277, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, mode

ssitu commented 9 months ago

Looks like the memory error doesn't directly happen from the code in this repo, so it'd be hard to track down where the problem is. Does sd 1.5 do the same if you use a large enough latent?

manzonif commented 9 months ago

Using an SD 1.5 model, in the third round it slows down a lot, but it doesn't go out of memory.

manzonif commented 9 months ago

Here is the log from prompt window:

ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 181, in sample
    return KSamplerFABRICAdv().sample(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 138, in sample
    return fabric_sample(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 52, in fabric_sample
    samples = KSamplerAdvanced().sample(model_patched, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive,
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1320, in sample
    return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1256, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
    raise e
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)  # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 178, in animatediff_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 100, in sample
    samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 129, in KSampler_sample
    return _KSampler_sample(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 711, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 138, in sample
    return _sample(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 617, in sample
    samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 556, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 580, in sample_dpmpp_2m
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 277, in forward
    out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 267, in forward
    return self.apply_model(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 900, in apply_model
    out = super().apply_model(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 264, in apply_model
    out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 252, in sampling_function
    cond, uncond = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 228, in calc_cond_uncond_batch
    output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 200, in unet_wrapper
    return model_func(input, ts, **c)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_utils.py", line 17, in <lambda>
    setattr(resolved_obj, func_path[-1], lambda *args, **kwargs: self(*args, **kwargs))
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_utils.py", line 28, in __call__
    return self.__orig_func(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 68, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\SeargeSDXL\modules\custom_sdxl_ksampler.py", line 70, in new_unet_forward
    x0 = old_unet_forward(self, x, timesteps, context, y, control, transformer_options, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\FreeU_Advanced\nodes.py", line 188, in __temp__forward
    h = forward_timestep_embed(self.middle_block, h, emb, context, transformer_options)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 37, in forward_timestep_embed
    x = layer(x, context, transformer_options)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 560, in forward
    x = block(x, context=context[i], transformer_options=transformer_options)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 390, in forward
    return checkpoint(self._forward, (x, context, transformer_options), self.parameters(), self.checkpoint)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\diffusionmodules\util.py", line 123, in checkpoint
    return func(*inputs)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 434, in _forward
    n, context_attn1, value_attn1 = p(n, context_attn1, value_attn1, extra_options)
  File "D:\ai\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 115, in store_hidden_states
    self.all_hiddens[idx] = torch.cat([self.all_hiddens[idx], q], dim=0)
torch.cuda.OutOfMemoryError: Allocation on device 0 would exceed allowed memory. (out of memory)
Currently allocated     : 26.53 GiB
Requested               : 1.44 GiB
Device limit            : 23.99 GiB
Free (according to CUDA): 0 bytes
PyTorch limit (set by user-supplied memory fraction)
                        : 17179869184.00 GiB

Prompt executed in 108.66 seconds
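
For context on the failing line, here is an illustrative sketch (not the repo's code) of how repeated torch.cat calls on a cached hidden-state dict keep every round's activations alive; storing the cache on the CPU and moving it to the GPU only when needed is one common mitigation:

# Illustrative only: repeatedly concatenating cached hidden states, as in
# store_hidden_states above, grows the cache with every round and step. Offloading
# to CPU before caching lets the GPU copy be freed.

import torch

all_hiddens: dict[int, torch.Tensor] = {}

def cache_hidden(idx: int, q: torch.Tensor) -> None:
    # Detach and move to CPU before caching so the GPU copy can be released.
    q_cpu = q.detach().to("cpu")
    if idx in all_hiddens:
        all_hiddens[idx] = torch.cat([all_hiddens[idx], q_cpu], dim=0)
    else:
        all_hiddens[idx] = q_cpu

# Demo on CPU tensors: three "rounds" of caching for transformer block 0.
for _ in range(3):
    cache_hidden(0, torch.randn(1, 4096, 320))
print(all_hiddens[0].shape)  # torch.Size([3, 4096, 320])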

manzonif commented 9 months ago

I ran out of memory with SD 1.5 with an empty latent of 1424 x 1424.
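
A rough back-of-envelope illustration (assuming the usual 8x VAE downscale, so the token count scales with latent area): the sequences that reference conditioning has to cache grow linearly with the tokens, and full self-attention grows roughly quadratically, which lines up with memory exploding at larger latent sizes. Exact counts depend on the model; the numbers below are only indicative.

def latent_tokens(width_px: int, height_px: int, downscale: int = 8) -> int:
    # Stable Diffusion latents are 1/8 of the pixel resolution; the finest
    # attention level sees roughly one token per latent position.
    return (width_px // downscale) * (height_px // downscale)

for size in (512, 1024, 1424):
    print(f"{size}x{size}: {latent_tokens(size, size)} tokens")
# 512x512:   4096 tokens
# 1024x1024: 16384 tokens (4x)
# 1424x1424: 31684 tokens (~7.7x)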

ssitu commented 9 months ago

Interesting. I could run the round-by-round example with SD 1.5 in less than 4 GB of VRAM on an older Comfy version, so either an update caused the bad memory usage, or it's because I'm forced to use fp32 on my GPU. Unfortunately, I can't test SDXL. Could you try the force fp32 option and see if that changes the memory usage?
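
For reference, fp32 stores 4 bytes per element versus 2 for fp16, so dtype alone roughly doubles the size of activations and any cached hidden states; a quick check (the tensor shape is illustrative, not measured from FABRIC):

import torch

# 4096 tokens x 320 channels is on the order of one attn1 hidden state for a
# 512x512 SD 1.5 latent (illustrative shape only).
for dtype in (torch.float16, torch.float32):
    t = torch.empty(1, 4096, 320, dtype=dtype)
    mib = t.numel() * t.element_size() / 2**20
    print(dtype, f"{t.element_size()} bytes/element, {mib:.1f} MiB per state")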

manzonif commented 9 months ago

With --force-fp32 the memory usage doesn't change for me (I tried --force-fp32, --force-fp32 --fp32-vae, and --force-fp32 --fp32-vae --fp32-text-enc).

manzonif commented 9 months ago

With SD 1.5 and a 512x512 latent I reach a peak of 10.5 GB of VRAM (4 GB in the first stage).

manzonif commented 9 months ago

In the meantime, with the latest ComfyUI update from a few hours ago, the transformer_index error has disappeared.

ssitu commented 9 months ago

Interesting, thanks for letting me know

ssitu commented 8 months ago

Since the original problem is fixed, I will close this issue. Open a new one if there are still memory problems.