XpucT opened this issue 9 months ago
Same issue here, looking forward to seeing the changes!
Faced the same problem a few days ago. I thought it was just on my side. Please fix it.
It already performs the controlnet per tile, am I misunderstanding?
Yes, it works with the ControlNet model, but not with the preprocessors.
Ah ok, I think I understand. One problem is that the ControlNet already runs before this node, so I'll have to throw away the first preprocessed image and redo the preprocessing for every tile, but I think that will do the job.
On second thought, I won't have access to the pre-processors, so I won't be able to run them.
ComfyUI_TiledKSampler supports slicing controlnets: https://github.com/BlenderNeko/ComfyUI_TiledKSampler/blob/4fcc003f62efdbdec2f79eab37e1fb8dd84bac64/nodes.py#L255
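For reference, the idea there is roughly to crop the already-preprocessed control hint to each tile's region before sampling that tile. A minimal sketch of that (not the actual TiledKSampler code; the tensor layout, the 8x latent-to-pixel factor, and the attribute name in the usage comment are assumptions):

```python
import torch

def slice_control_hint(hint: torch.Tensor, tile_x: int, tile_y: int,
                       tile_w: int, tile_h: int) -> torch.Tensor:
    """Crop a preprocessed ControlNet hint image to one tile's region.

    hint: (B, C, H, W) pixel-space hint produced by a preprocessor.
    tile_x/y/w/h: tile position and size in latent coordinates, scaled
    by 8 here to get pixel coordinates (assumption).
    """
    x0, y0 = tile_x * 8, tile_y * 8
    x1, y1 = x0 + tile_w * 8, y0 + tile_h * 8
    return hint[:, :, y0:y1, x0:x1]

# Hypothetical usage: point the ControlNet at the cropped hint before
# sampling this tile (the attribute name below is an assumption).
# cnet.cond_hint_original = slice_control_hint(hint, tx, ty, tw, th)
```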
That's already being done in this node. If I'm understanding the issue correctly, the suggestion is that the preprocessors should be applied to each tile after it has been upscaled, so there's more detail for the sampling. The only solution I can think of is making a new node that has the preprocessors hardcoded and toggleable in the node settings, so it won't be very flexible.
For now, probably the best way to do this is to make the image larger before passing it to the preprocessors. Hopefully the preprocessor will give better detail on the larger image. I don't know how the size of the image impacts the preprocessors, but ideally these two workflows produce the same result: [image -> resize -> preprocessor -> tile] versus [image -> tile -> resize -> preprocessor], where the latter is what happens in A1111 and the former is my suggestion.
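To make the two orderings concrete, here is a rough sketch. `preprocess` stands in for any ControlNet preprocessor and is hypothetical; the plain-grid tiling and LANCZOS upscale are simplified assumptions (the real node uses overlapping tiles and its own upscalers):

```python
from PIL import Image

def preprocess(img: Image.Image) -> Image.Image:
    """Placeholder for a ControlNet preprocessor (e.g. canny, depth)."""
    raise NotImplementedError

def tiles(img: Image.Image, tile: int):
    """Yield non-overlapping crops of the image (simplified grid, no overlap)."""
    for y in range(0, img.height, tile):
        for x in range(0, img.width, tile):
            yield img.crop((x, y, min(x + tile, img.width),
                            min(y + tile, img.height)))

def upscale(img: Image.Image, scale: int) -> Image.Image:
    return img.resize((img.width * scale, img.height * scale), Image.LANCZOS)

def suggested_order(img: Image.Image, scale: int, tile: int):
    # image -> resize -> preprocessor -> tile
    return list(tiles(preprocess(upscale(img, scale)), tile))

def a1111_order(img: Image.Image, scale: int, tile: int):
    # image -> tile -> resize -> preprocessor (run per tile)
    return [preprocess(upscale(t, scale)) for t in tiles(img, tile // scale)]
```

Whether the two lists actually match depends on how sensitive the preprocessor is to image size and to missing context outside each tile, which is exactly the open question above.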
Looking forward to seeing the new node!
@ssitu Man. Any news?
https://github.com/ssitu/ComfyUI_UltimateSDUpscale/assets/1196046/818e4fae-f69c-4c50-9f85-ef4d2ff992e0
USDU.json