BlenderNeko / ComfyUI_TiledKSampler

Tiled samplers for ComfyUI
GNU General Public License v3.0

not working after comfyui update #28

Closed MYKY69 closed 8 months ago

MYKY69 commented 8 months ago

After updating ComfyUI from git, I'm getting this:

```
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "/home/myky/Stažené/git/ComfyUI/execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "/home/myky/Stažené/git/ComfyUI/execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "/home/myky/Stažené/git/ComfyUI/execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(*slice_dict(input_data_all, i)))
  File "/home/myky/Stažené/git/ComfyUI/custom_nodes/ComfyUI_TiledKSampler/nodes.py", line 312, in sample
    return sample_common(model, 'enable', seed, tile_width, tile_height, tiling_strategy, steps_total, cfg, sampler_name, scheduler, positive, negative, latent_image, steps_total-steps, steps_total, 'disable', denoise=1.0, preview=True)
  File "/home/myky/Stažené/git/ComfyUI/custom_nodes/ComfyUI_TiledKSampler/nodes.py", line 123, in sample_common
    comfy.model_management.load_models_gpu([model] + modelPatches, comfy.model_management.batch_area_memory(noise.shape[0] * noise.shape[2] * noise.shape[3]) + inference_memory)
AttributeError: module 'comfy.model_management' has no attribute 'batch_area_memory'
```

Seems like the ComfyUI update killed it :(

MYKY69 commented 8 months ago

https://github.com/comfyanonymous/ComfyUI/commit/dd4ba68b6e93a562d9499eff34e50dbbbc8714e7 seems like this is the commit that broke it

OliverCrosby commented 8 months ago

> comfyanonymous/ComfyUI@dd4ba68 seems like this is the commit that broke it

I added the following code (which that commit removed) back at line 582 of comfy/model_management.py, and it seems to have fixed the issue for now:

```python
def batch_area_memory(area):
    if xformers_enabled() or pytorch_attention_flash_attention():
        # TODO: these formulas are copied from maximum_batch_area below
        return (area / 20) * (1024 * 1024)
    else:
        return (((area * 0.6) / 0.9) + 1024) * (1024 * 1024)
```
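
For reference, the restored helper just estimates the VRAM (in bytes) needed for a given latent area. A rough worked example of the xformers branch, assuming a 512x512 image (a batch-1, 64x64 latent):

```python
# Rough check of the restored formula (assumes the xformers branch is taken).
area = 1 * 64 * 64                        # batch * latent height * latent width for a 512x512 image
memory_bytes = (area / 20) * (1024 * 1024)
print(f"{memory_bytes / (1024 * 1024):.1f} MiB")  # ~204.8 MiB reserved before sampling
```
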
Lerc commented 8 months ago

Using the new memory_required method seems to be working for me, changing

```python
comfy.model_management.load_models_gpu([model] + modelPatches, comfy.model_management.batch_area_memory(noise.shape[0] * noise.shape[2] * noise.shape[3]) + inference_memory)
```

to

```python
comfy.model_management.load_models_gpu([model] + modelPatches, model.memory_required(noise.shape) + inference_memory)
```
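
For anyone who wants the node to keep working on both older and newer ComfyUI builds, here is a minimal, untested sketch of a version-tolerant variant of that call. It assumes the surrounding `sample_common` variables (`model`, `modelPatches`, `noise`, `inference_memory`) and simply picks whichever memory estimate the installed ComfyUI provides:

```python
import comfy.model_management

# Prefer the old helper if this ComfyUI build still has it; otherwise fall back
# to the model's own memory estimate introduced by the commit linked above.
if hasattr(comfy.model_management, 'batch_area_memory'):
    sampling_memory = comfy.model_management.batch_area_memory(
        noise.shape[0] * noise.shape[2] * noise.shape[3])
else:
    sampling_memory = model.memory_required(noise.shape)

comfy.model_management.load_models_gpu([model] + modelPatches, sampling_memory + inference_memory)
```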