nullquant / ComfyUI-BrushNet

ComfyUI BrushNet nodes
Apache License 2.0

Discussion on the effect of inpaint #26

Closed xxxbrokenboi closed 2 months ago

xxxbrokenboi commented 5 months ago

When I use your nodes, I noticed that after inverting the mask, areas that should not have been inpainted were affected. As in images 2 and 3 below, text that was sharp became blurred by the inpaint. Do you know why? Could it be a parameter issue? I adjusted the STEPS & SCALE, but that doesn't seem to fix the problem. Thanks for your work!

  1. image
  2. original image
  3. inpainted image
nullquant commented 5 months ago

It is a common problem. First, the VAE blurs images. You can try it: load an image, VAE-encode it, then VAE-decode it and look at the result. Second, SD repaints the whole image even when inpainting with a mask. So the solution is to blend both images, such that the original image is preserved in the non-masked areas. Look at the new example I added.
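The blend step described above can be sketched in a few lines of NumPy. This is a hypothetical helper, not the repository's actual node; it assumes float images in [0, 1] and a mask where 1.0 marks the inpainted region:

```python
import numpy as np

def blend_preserve_original(original, inpainted, mask):
    """Composite so non-masked pixels come straight from the original.

    original, inpainted: H x W x C float arrays in [0, 1]
    mask: H x W float array, 1.0 where inpainting is wanted, 0.0 elsewhere
    """
    m = mask[..., None]  # broadcast the mask over the channel axis
    # Where mask == 0 the output is exactly the original pixel,
    # so VAE blur and SD repainting outside the mask are discarded.
    return original * (1.0 - m) + inpainted * m
```

Feathering the mask edge first (e.g. a small Gaussian blur on `mask`) hides the seam between the preserved and repainted regions.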

caniyabanci76 commented 5 months ago

I was not aware of this. Does this mean that all other inpainting implementations do this basic blending of the original and inpainted images with the mask applied (just as a hidden step)?

nullquant commented 5 months ago

As far as I know, no, but then you usually get blur all across the image. BTW, do you know that when you zoom an image in the ComfyUI workflow, your browser "helps" you by softening and smoothing it? To compare real images you should use something like FastStone (https://www.faststone.org/).
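If you want to compare images without any viewer resampling getting in the way, you can also diff the raw pixel arrays directly. A minimal sketch (a hypothetical helper, assuming both images have already been loaded into same-shaped arrays):

```python
import numpy as np

def pixel_diff_report(a, b):
    """Summarize per-pixel differences between two same-shaped image arrays."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    diff = np.abs(a - b)
    # For H x W x C images, a pixel counts as changed if any channel differs.
    per_pixel = diff.max(axis=-1) if diff.ndim == 3 else diff
    return {
        "max_abs_diff": float(diff.max()),
        "mean_abs_diff": float(diff.mean()),
        "changed_pixels": int(np.count_nonzero(per_pixel)),
    }
```

This is also a quick way to quantify the VAE round-trip blur mentioned earlier: encode/decode an image and compare the result against the original.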

xxxbrokenboi commented 5 months ago

I had the same thought. When I was playing around with a lot of open inpainting demos deployed on Hugging Face, such as Tencent's BrushNet demo, and running the same product background transformation task, I was getting outputs that were not blurred, and I wondered what trick they were using. A normal blend always results in slightly unnatural product lighting and shadows, but their demo images didn't seem to have these issues.