Open asagi4 opened 1 year ago
As you mentioned, this is an upstream problem where extra, unneeded batches are created. It was previously resolved for the purposes of this plugin by overriding `maximum_batch_area`, specifically with `area = 200 * memory_free`.
200 is an arbitrary number, larger than what was set upstream at the time; you could try adjusting it further.
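For reference, a minimal sketch of that override as a monkey-patch (assuming `comfy.model_management` still exposes `maximum_batch_area()` and `get_free_memory()` as it did when this was written; the scaling here is illustrative, not the plugin's verbatim code):

```python
import comfy.model_management as mm

def _patched_maximum_batch_area():
    # Illustrative re-creation of the override: treat the allowed batch area as
    # proportional to free memory, with 200 as the arbitrary multiplier
    # mentioned above (tune it if generations still fail).
    memory_free = mm.get_free_memory() / (1024 * 1024)  # bytes -> MB
    return int(max(200 * memory_free, 0))

# Replace the function the sampler consults when deciding how much to batch.
mm.maximum_batch_area = _patched_maximum_batch_area
```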
To note, this would succeed once modules supporting larger batch sizes are available. However, the issue is not specific to AITemplate; it's just surfacing because there are currently only batch 1 (2, for cfg) modules. It may be best reported upstream so that the batching can be adjusted.
I've found a case where AITemplate fails a generation that succeeds without it.
It's got something to do with how ComfyUI batches conds; in some cases, it'll do chunks with size > 2, which breaks AITemplate.
I can work around the problem by removing the batching logic in `comfy/samplers.py` after the line `to_batch = to_batch_temp[:1]` and limiting it to two at most. See this patch.
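Roughly, the idea looks like the sketch below. This is not a verbatim copy of `comfy/samplers.py`, and `clamp_batch` is just a name used here for illustration: upstream starts from `to_batch_temp[:1]` and then grows the chunk based on the memory budget, while the workaround refuses to grow it past two conds.

```python
def clamp_batch(to_batch_temp, max_conds=2):
    # Start from a single cond, as upstream does with to_batch_temp[:1].
    to_batch = to_batch_temp[:1]
    # Instead of letting the memory-based loop grow this further, never batch
    # more than max_conds conds together, since only batch 1 (2, for cfg)
    # AITemplate modules exist right now.
    if len(to_batch_temp) >= max_conds:
        to_batch = to_batch_temp[:max_conds]
    return to_batch

# Example: five pending conds get chunked two at a time instead of all at once.
print(clamp_batch(["cond", "uncond", "cond_ts1", "uncond_ts1", "cond_ts2"]))
```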
I've attached a workflow to reproduce the issue (on my system, at least).
To run the workflow you'll need my prompt control nodes from https://github.com/asagi4/comfyui-prompt-control since I haven't hit this bug without timestep ranges yet.
It doesn't seem to happen with all seeds, either. I used RevAnimated for the reproduction, but any model should do.
bug.zip