apple / ml-stable-diffusion

Stable Diffusion with Core ML on Apple Silicon
MIT License

Inpainting support #148

Open aajank opened 1 year ago

aajank commented 1 year ago

Does this support inpainting, or do we have to wait?

ZachNagengast commented 1 year ago

+1 to this

I see there were already some plans for it, but they're commented out? Is anyone actively working on it?

https://github.com/apple/ml-stable-diffusion/blame/48f07f24891155a14c51dd835bba7371bdf32d0e/swift/StableDiffusion/pipeline/StableDiffusionPipeline.Configuration.swift#L14

jrittvo commented 1 year ago

I have read that in the python-diffusers world, inpainting works best with dedicated models that have an additional input layer (in the Unet?) that directly accepts the inpaint mask data, which helps preserve the mask edge details. I wonder if the ControlledUnet we now have also has an extra layer to accept the ControlNet input, and whether inpainting could leverage this and be much simpler to implement now than it was before ControlNet.
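
The "additional layer" idea above can be sketched without any ML library: in diffusers, dedicated inpainting checkpoints widen the UNet's first convolution from 4 to 9 input channels by concatenating the noisy latents, the binary mask, and the masked-image latents along the channel axis. The names below are illustrative assumptions, not this repository's API; this only demonstrates the channel bookkeeping.

```python
# Sketch (plain Python, no ML libraries) of how a dedicated inpainting UNet
# receives the mask: the standard SD UNet takes 4 latent channels, while an
# inpainting UNet concatenates mask and masked-image latents, giving 9.
# All names here are illustrative, not the repository's actual API.

def concat_channels(*tensors):
    """Concatenate lists of channel planes (each plane is a 2D list)."""
    out = []
    for t in tensors:
        out.extend(t)
    return out

H = W = 2  # tiny latent spatial size, just for illustration
noisy_latents = [[[0.0] * W for _ in range(H)] for _ in range(4)]  # 4 channels
mask = [[[1.0] * W for _ in range(H)]]                             # 1 channel
masked_image_latents = [[[0.0] * W for _ in range(H)] for _ in range(4)]

unet_input = concat_channels(noisy_latents, mask, masked_image_latents)
print(len(unet_input))  # -> 9, matching an inpainting checkpoint's conv_in
```

The edge-detail benefit jrittvo mentions comes from the mask entering the denoising loop itself, rather than being applied only as a post-hoc composite.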

ZachNagengast commented 1 year ago

Looks like this PR added hole-punching support, allowing a ControlNet model to do inpainting. More info here: https://github.com/godly-devotion/MochiDiffusion/pull/272
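
For anyone unfamiliar with the term: "hole punching" here means compositing the generated output back onto the original image so that only the masked region ("the hole") keeps generated pixels. A minimal sketch over grayscale pixel grids, assuming a binary mask where 1 marks the hole; this is my reading of the approach, not MochiDiffusion's actual code:

```python
# Hole-punch compositing sketch: keep generated pixels where mask == 1,
# keep the original image everywhere else. Plain Python over 2D grids;
# names and shapes are illustrative assumptions.

def hole_punch(original, generated, mask):
    """Per-pixel blend: mask==1 takes generated, mask==0 keeps original."""
    return [
        [g if m else o for o, g, m in zip(orow, grow, mrow)]
        for orow, grow, mrow in zip(original, generated, mask)
    ]

original  = [[10, 10], [10, 10]]
generated = [[99, 99], [99, 99]]
mask      = [[0, 1], [1, 0]]  # 1 marks the hole to fill

print(hole_punch(original, generated, mask))  # -> [[10, 99], [99, 10]]
```

This keeps the unmasked area pixel-identical to the input, which is exactly why the dedicated-model approach discussed above matters for the *edges* of the mask, where a hard composite can leave seams.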

ynagatomo commented 1 year ago

It works fine. (Screenshot attached: SD15ControlNet11InPaint 001)