sneccc closed this issue 12 months ago.
DreamStudio and some other folks working on similar models have been pushing CLIP guidance pretty hard, so I expect we'll want the option.
There's an example we can look at here: https://github.com/huggingface/diffusers/blob/9be94d9c6659f7a0a804874f445291e3a84d61d4/examples/community/clip_guided_stable_diffusion.py#L113
It does look a lot more resource-hungry than the classifier-free guidance we've been using, both in computation and memory.
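For reference, the extra cost comes from what happens inside each denoising step: the current latents get decoded to pixels, run through the CLIP image encoder, and then a gradient of a CLIP similarity loss is backpropagated through both to nudge the noise prediction. Below is a minimal sketch of that core idea, loosely following the linked community pipeline; `decode_latents`, the normalization tensors, and the signature are assumptions for illustration, not the actual pipeline API.

```python
# Sketch of the per-step work CLIP guidance adds on top of classifier-free guidance.
# Assumes a transformers CLIPModel, a diffusers-style scheduler with `alphas_cumprod`,
# and a hypothetical `decode_latents` helper that maps latents to images in [0, 1].
import torch
import torch.nn.functional as F

def clip_guided_noise_pred(noise_pred, latents, timestep, scheduler, clip_model,
                           clip_mean, clip_std, text_embeddings, decode_latents,
                           guidance_scale=100.0):
    """Adjust the UNet's noise prediction using the gradient of a CLIP similarity loss."""
    latents = latents.detach().requires_grad_(True)

    # DDIM-style estimate of the clean latents from the current noisy latents.
    alpha_prod_t = scheduler.alphas_cumprod[timestep]
    pred_original = (latents - (1 - alpha_prod_t) ** 0.5 * noise_pred) / alpha_prod_t ** 0.5

    # Decode to pixel space and normalize for the CLIP image encoder.
    image = decode_latents(pred_original)                      # (B, 3, H, W) in [0, 1]
    image = F.interpolate(image, size=(224, 224), mode="bicubic")
    image = (image - clip_mean) / clip_std

    # Cosine-distance loss between CLIP image embeddings and the prompt embeddings.
    image_embeds = clip_model.get_image_features(image)
    image_embeds = image_embeds / image_embeds.norm(dim=-1, keepdim=True)
    loss = (1 - (image_embeds * text_embeddings).sum(dim=-1)).mean()

    # The gradient w.r.t. the latents steers the sample toward the prompt.
    grad = torch.autograd.grad(loss * guidance_scale, latents)[0]
    return noise_pred + (1 - alpha_prod_t) ** 0.5 * grad
```

So every sampling step pays for an extra VAE decode, a CLIP forward pass, and a backward pass through both, which is where the computation and memory overhead comes from.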
Yeah, it is resource-hungry, but for people who rent GPUs (A6000, 3090, etc.) it would be fine.
There has been no activity in this issue for 14 days. If this issue is still being experienced, please reply with an updated confirmation that the issue is still being experienced with the latest release.
Will you guys implement a CLIP guidance feature like the one in DreamStudio?