lquesada / ComfyUI-Inpaint-CropAndStitch

ComfyUI nodes that crop before sampling and stitch back after sampling, speeding up inpainting
GNU General Public License v3.0

Blur Mask and Blend pixels #48

Closed Nekodificador closed 1 day ago

Nekodificador commented 5 days ago

Hi Luis,

Would it be possible to widen the ranges for blur mask pixels and blend pixels? I normally do my inpaintings with Differential Diffusion and need considerably larger gradients, and when I manipulate the mask afterwards it tends to fall outside the original margins.

On another note, it would be great to have a no-padding option, so no extra margins are generated outside the original image (even if they are cropped later, they sometimes leave small edges).

lquesada commented 4 days ago

Hi,

Do you mean increasing the max value for blur pixels and blend pixels? What maximum values would be enough for your use cases?

The nodes don't behave well with very high values, because they increase the context area a lot, which makes the inpainting much lower resolution.

Alternatively, have you tried just increasing the context area?


Nekodificador commented 3 days ago

Personally, I would increase the mask feather to at least 128 or 256 pixels. I can show you an example of how I manually build the same functions as Inpaint Crop using LayerStyle.

As you can see, the mask feather works independently, and I manage the context with the yellow slider. This allows me to blur the mask without directly affecting the context area.
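The "independent feather" behavior described above can be sketched as a simple falloff that is computed purely from distance to the mask edge, without touching the context crop box. This is a minimal illustration, not the LayerStyle implementation; the function name and the linear ramp are assumptions for clarity.

```python
def feather_value(dist_to_edge: float, feather_px: float) -> float:
    """Mask opacity in [0, 1] for a pixel `dist_to_edge` pixels inside
    the mask boundary, feathered over `feather_px` pixels.

    The context area is untouched: only mask opacity changes, so a large
    feather does not force a larger crop. (Hypothetical sketch.)
    """
    if feather_px <= 0:
        return 1.0  # no feathering: hard mask edge
    # linear ramp from 0 at the edge to 1 at feather_px inside the mask
    return max(0.0, min(1.0, dist_to_edge / feather_px))
```

For example, with a 128-pixel feather a pixel 64 px inside the mask gets opacity 0.5, while anything deeper than 128 px stays fully opaque.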

In your nodes, if I expand the context area, it extends beyond the original image. I find this very useful for outpaintings, so I wouldn’t remove it, but it would be interesting to have an option to decide whether this happens or not.

On the other hand, when I need to expand the mask beyond 64 pixels, I have to add an extra blur afterward. The issue is that since I can't manipulate the final blend, and it’s smaller than the mask, it leaves noticeable edges.

The other problem I encounter is that when I blur the mask, it not only affects the one I created but also the original transparency of the image. I suppose at some point in the process they’re merging, which causes the alpha to expand outward, creating margins that enlarge the bounding box. As I said, this is very interesting for creating outpaintings, but considering that the stitch will later restore everything to its original margins, I’m just losing resolution.

[screenshot: manual LayerStyle setup replicating the Inpaint Crop functions]

lquesada commented 3 days ago

Hi,

I suggest you edit the .py file locally and try increasing the max values to e.g. 256. From my tests, pushing the blur/blend pixels that high eventually crops them at the border of the mask or extends the mask too much, making the inpaint extremely low resolution. Give it a go with local edits and let me know how it goes.
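For reference, ComfyUI nodes declare their widget ranges in an `INPUT_TYPES` classmethod, so the local edit amounts to raising the `"max"` entry. `blur_mask_pixels` is the parameter named later in this thread; `blend_pixels` and the default/step values here are assumptions for illustration, and the real dictionary in the node's .py file will have more entries.

```python
# Hypothetical sketch of the relevant fragment of a ComfyUI node's
# INPUT_TYPES; the actual file contains many more parameters.
def INPUT_TYPES_sketch():
    return {
        "required": {
            # raise "max" (e.g. from 64.0 to 256.0) to allow larger feathering
            "blur_mask_pixels": ("FLOAT", {"default": 16.0, "min": 0.0,
                                           "max": 256.0, "step": 0.1}),
            "blend_pixels": ("FLOAT", {"default": 16.0, "min": 0.0,
                                       "max": 256.0, "step": 0.1}),
        }
    }
```

After editing, restarting ComfyUI reloads the node definition and the widget accepts the new range.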

I understand that expanding beyond the original image is an issue, but that happens because very large blend/blur masks are not a use case this node was designed for.

I'll leave this open until you report back with the results of the experiment above (increasing locally to e.g. 256). I may consider limiting how far the image is extended if it's easy to do, e.g. extend the mask but then clip it to the original image area.
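The "extend mask, then clip to original image area" idea mentioned above can be sketched as growing the crop box by the blend margin and then clamping it to the canvas, so no padding is generated outside the original image. Function and parameter names are hypothetical, not taken from the node's code.

```python
def clamp_box(x0: int, y0: int, x1: int, y1: int,
              margin: int, width: int, height: int) -> tuple:
    """Expand a crop box (x0, y0, x1, y1) by `margin` pixels on every
    side, then clamp it to the image bounds (width x height) so the
    crop never extends beyond the original canvas. (Hypothetical sketch.)
    """
    return (max(0, x0 - margin), max(0, y0 - margin),
            min(width, x1 + margin), min(height, y1 + margin))
```

For a 512x512 image, a box near the corner is simply cut off at the edge instead of padding the canvas: `clamp_box(10, 10, 100, 100, 32, 512, 512)` yields `(0, 0, 132, 132)`.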

Nekodificador commented 3 days ago

I've done what you told me and what I’m noticing is that if you use differential diffusion, there’s practically no need to add a blend pixel, since the edges of the mask receive such low denoise that they blend seamlessly with the original image.

By tweaking the node, I’ve managed to get up to 256 pixels of feather mask without any issues. What does cause the problem is the blend pixels, as it expands the bounding box and leads to the situation we’ve already discussed: the loss of effective resolution.

So, I think the feather mask could be increased without any issues at least to 256 (that should be more than enough for 99% cases). The blend pixels, however, would need some sort of rework (if that’s something you’re planning to address in the future, of course).

If you’d like feedback from me in this regard, I’d be more than happy to test any changes you’d like to make. If not, there’s no problem at all—your nodes already solve plenty of issues as they are. Knowing that I can tweak the Python script to suit my needs is more than enough.

Thanks!

lquesada commented 3 days ago

Thanks for checking! I'll do some experiments, and if it doesn't cause major issues (e.g. expanding the context area too widely), I'll increase the max value for blur pixels to 256.

Cheers! Luis.


lquesada commented 1 day ago

Hi,

I did some tests and increased the limit for blur_mask_pixels to 256. Thanks!

Luis.
