ThisisBillhe / tiny-stable-diffusion

Tiny optimized Stable-diffusion that can run on GPUs with just 1GB of VRAM. (Beta)

quantize custom model #1

Closed KohakuBlueleaf closed 11 months ago

KohakuBlueleaf commented 11 months ago

As the title says: if I want to quantize my own model, can I just use the script in PTQD? Or do we need to change something, like the dataset for reconstruction or the bit width?

ThisisBillhe commented 11 months ago

Yes, you can quantize your model with PTQD. It works fine with a higher bit width (>=4). If your model structure differs significantly from the UNet used in Stable Diffusion, the code for reconstruction may need to be modified.
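For intuition on what a higher bit width buys you, here is a minimal sketch of symmetric per-tensor uniform weight quantization (fake-quant). This is a simplified stand-in, not PTQD's actual calibration/reconstruction pipeline, and the function name is hypothetical:

```python
import numpy as np

def quantize_weight(w, n_bits=4):
    """Symmetric per-tensor uniform quantization (fake-quant sketch)."""
    qmax = 2 ** (n_bits - 1) - 1           # e.g. 7 for 4-bit
    scale = np.abs(w).max() / qmax         # one scale for the whole tensor
    w_int = np.clip(np.round(w / scale), -qmax - 1, qmax)  # integer grid
    return w_int * scale                   # dequantized weights

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
w_hat = quantize_weight(w, n_bits=4)       # at most 2**4 distinct values
```

At 4 bits the rounding error per weight is bounded by half the quantization step; below 4 bits that step grows quickly, which is why very low bit widths need the kind of reconstruction-based methods the repository builds on.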

KohakuBlueleaf commented 11 months ago

@ThisisBillhe Thanks for the information!

I have a few other questions:

1. Will you support SDXL in the future? It uses a different repository (sgm from Stability AI).
2. Will you consider making/implementing/supporting sd-webui or ComfyUI? These are common GUI clients for Stable Diffusion.

I can try to do these two myself, but I want to check whether you have any plans for them.

ThisisBillhe commented 11 months ago

Well, I am currently working on research into extremely low-bit quantization of diffusion models, so the answer is no for now.

KohakuBlueleaf commented 11 months ago

@ThisisBillhe ok!

moonlightian commented 6 months ago


Hi~ Have you worked out low-bit SDXL? And would it be convenient for you to share some data about the quantized model?