huggingface / diffusers

🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
https://huggingface.co/docs/diffusers
Apache License 2.0

JAX / Flax: avoid recompilation on params change: guidance scale, size, number of steps #1015

Closed pcuenca closed 1 year ago

pcuenca commented 2 years ago

This is required to improve the demo(s); we currently have the sliders disabled.

Some users are also requesting it on Discord and the forums. Our demo notebook is very fast on TPU hardware, but users can't quickly explore different parameters.
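
For background, the guidance-scale part is mostly about how the value reaches the jitted/pmapped function: if it is baked in as a static argument, every new slider value forces a fresh trace and XLA compilation, while passing it as a traced array lets a single compiled program serve all values (image size and the number of steps change tensor shapes and loop bounds, so they are harder to handle). A minimal sketch of the general idea, with hypothetical names rather than the actual pipeline code:

import jax
import jax.numpy as jnp

def guided_noise(noise_uncond, noise_text, guidance_scale):
    # classifier-free guidance combination of the two noise predictions
    return noise_uncond + guidance_scale * (noise_text - noise_uncond)

# traced version: guidance_scale is an ordinary array input, so new values
# reuse the compiled program
guided_traced = jax.jit(guided_noise)

# static version: guidance_scale is baked into the trace, so every new value
# triggers a recompilation
guided_static = jax.jit(guided_noise, static_argnums=(2,))

uncond = jnp.zeros((1, 4, 64, 64))
text = jnp.ones((1, 4, 64, 64))

out_a = guided_traced(uncond, text, jnp.asarray(7.5))  # compiles once
out_b = guided_traced(uncond, text, jnp.asarray(9.0))  # no recompilation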

camenduru commented 2 years ago

Please make a script that can convert original Stable Diffusion checkpoints to Flax 🚀

camenduru commented 2 years ago

and please make an auto-converter: if someone publishes a model like this https://huggingface.co/nitrosocke/modern-disney-diffusion, it would automatically be converted to all the different formats 🤖🧨

patrickvonplaten commented 1 year ago

Hey @camenduru, it should be as easy as doing:

from diffusers import FlaxStableDiffusionPipeline

pipeline, params = FlaxStableDiffusionPipeline.from_pretrained("<path-to-stable-diffusion-checkpoint>", from_pt=True)
pipeline.save_pretrained("<path/to/hub/repo/with/flax/branch>", params=params)
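
For completeness, loading the converted checkpoint back and sampling with it looks roughly like this (a sketch following the usual Flax pipeline flow; the prompt and seed are placeholders and the exact call signature may differ between diffusers versions):

import jax
from flax.jax_utils import replicate
from flax.training.common_utils import shard
from diffusers import FlaxStableDiffusionPipeline

pipeline, params = FlaxStableDiffusionPipeline.from_pretrained("<path/to/hub/repo/with/flax/branch>")

# one prompt per device; prepare_inputs tokenizes them into prompt ids
prompts = ["a photo of an astronaut riding a horse"] * jax.device_count()
prompt_ids = pipeline.prepare_inputs(prompts)

# replicate the params and shard the inputs across devices for pmapped inference
params = replicate(params)
prompt_ids = shard(prompt_ids)
rng = jax.random.split(jax.random.PRNGKey(0), jax.device_count())

images = pipeline(prompt_ids, params, rng, num_inference_steps=50, jit=True).images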

patrickvonplaten commented 1 year ago

We should add better docs for this though, and an automated converter tool sounds like a great idea indeed!

camenduru commented 1 year ago

@patrickvonplaten ✨ thanks, I converted https://huggingface.co/nitrosocke/mo-di-diffusion to Flax, but how do I push to a flax branch? I opened a PR, but the PR is going to the main branch 🆘

camenduru commented 1 year ago

and I converted it like this:

pipeline, params = FlaxStableDiffusionPipeline.from_pretrained("nitrosocke/mo-di-diffusion", from_pt=True)
pipeline.save_pretrained("/home/camenduru/mo-di-flax", params=params)

should I put this on a bf16 branch or a flax branch?
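
In case a bf16 variant is wanted too, one way to produce it (just a sketch, reusing the pipeline and params from the snippet above; the output path is made up) is to cast the float parameters before saving:

import jax
import jax.numpy as jnp

# cast every float32 leaf of the params pytree to bfloat16,
# then save it next to the fp32 version
params_bf16 = jax.tree_util.tree_map(
    lambda x: x.astype(jnp.bfloat16) if x.dtype == jnp.float32 else x, params
)
pipeline.save_pretrained("/home/camenduru/mo-di-flax-bf16", params=params_bf16)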

camenduru commented 1 year ago

I also converted waifu-diffusion ✨ haru will add it here 🎉

patrickvonplaten commented 1 year ago

and I converted it like this:

pipeline, params = FlaxStableDiffusionPipeline.from_pretrained("nitrosocke/mo-di-diffusion", from_pt=True)
pipeline.save_pretrained("/home/camenduru/mo-di-flax", params=params)

should I put this on a bf16 branch or a flax branch?

I'd put it on a flax branch :-)

patrickvonplaten commented 1 year ago

Hey @camenduru,

Thanks for opening the issue on the Hub. I answered there: https://huggingface.co/nitrosocke/mo-di-diffusion/discussions/8#6362a3672c0bb59ecc191a07. Please ping me if you need my help going forward :-) If it's OK with the author of the repo, I'd actually put the Flax weights directly on main: Flax and PyTorch weights can live side by side in the same repo branch.
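
For reference, doing that from a script could look roughly like this (a sketch using huggingface_hub and assuming write access to the repo; the local folder and repo id come from the snippets above):

from huggingface_hub import HfApi

api = HfApi()

# upload the converted Flax weights next to the existing PyTorch weights on main
api.upload_folder(
    folder_path="/home/camenduru/mo-di-flax",
    repo_id="nitrosocke/mo-di-diffusion",
    commit_message="Add Flax weights",
)

# or, if a dedicated branch is preferred, create it first and upload there
api.create_branch(repo_id="nitrosocke/mo-di-diffusion", branch="flax")
api.upload_folder(
    folder_path="/home/camenduru/mo-di-flax",
    repo_id="nitrosocke/mo-di-diffusion",
    revision="flax",
    commit_message="Add Flax weights",
)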

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.