ThisisBillhe / tiny-stable-diffusion

Tiny optimized Stable-diffusion that can run on GPUs with just 1GB of VRAM. (Beta)

Quantize my own model to 2 bits #4

Open mason5957 opened 8 months ago

mason5957 commented 8 months ago

Thank you for your efforts. I'm curious whether there is any code or script for quantizing my own 2-bit stable diffusion models, rather than relying on the pre-existing model available on Google Drive.

ThisisBillhe commented 8 months ago

You may refer to my other repos, PTQD and torch_quantizer (8-bit only).
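Neither referenced repo ships a 2-bit path out of the box, but the core operation behind low-bit weight quantization is per-tensor uniform affine quantization. A minimal numpy sketch of that idea (illustrative only, with hypothetical function names; not PTQD's or torch_quantizer's actual calibration code):

```python
import numpy as np

def quantize_uniform(w, n_bits=2):
    """Uniform affine quantization of a weight tensor to n_bits.

    Returns integer codes plus the (scale, zero_point) pair needed
    to dequantize. Hypothetical helper for illustration.
    """
    qmin, qmax = 0, 2 ** n_bits - 1                  # 2-bit -> codes 0..3
    w_min, w_max = float(w.min()), float(w.max())
    scale = (w_max - w_min) / (qmax - qmin) or 1.0   # guard all-equal tensors
    zero_point = round(-w_min / scale)
    q = np.clip(np.round(w / scale) + zero_point, qmin, qmax)
    return q.astype(np.int8), scale, zero_point

def dequantize(q, scale, zero_point):
    """Map integer codes back to approximate float weights."""
    return (q.astype(np.float32) - zero_point) * scale

# Example: 2-bit quantization collapses weights onto at most 4 levels.
w = np.array([-1.0, -0.3, 0.1, 0.9], dtype=np.float32)
q, s, zp = quantize_uniform(w, n_bits=2)
w_hat = dequantize(q, s, zp)
```

At 2 bits the reconstruction error is large, which is why practical 2-bit diffusion models rely on calibration and error-correction schemes (as in PTQD) rather than plain rounding like this.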