bes-dev / stable_diffusion.openvino


Does it support int8 quantization? #127

Open O-O1024 opened 1 year ago

O-O1024 commented 1 year ago

If it supported int8 quantization, the model size could be reduced by about 3/4. It might also be faster.
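
For reference, a minimal sketch of how int8 post-training quantization of the UNet IR could be attempted with NNCF's OpenVINO API. The model path, output path, input names, and calibration data below are hypothetical placeholders, not something this repo ships:

```python
# Hypothetical sketch: int8 post-training quantization of an OpenVINO IR with NNCF.
# Paths, input names, and calibration data are placeholders for illustration only.
import numpy as np
import nncf
import openvino as ov

core = ov.Core()
model = core.read_model("unet.xml")  # assumed path to the FP16/FP32 UNet IR

# Real calibration would use a few dozen representative diffusion inputs;
# random tensors are used here purely as stand-ins, and the input names
# ("latent_model_input", "t", "encoder_hidden_states") are assumptions.
def make_calibration_sample(_):
    return {
        "latent_model_input": np.random.randn(1, 4, 64, 64).astype(np.float32),
        "t": np.array([1], dtype=np.float32),
        "encoder_hidden_states": np.random.randn(1, 77, 768).astype(np.float32),
    }

calibration_dataset = nncf.Dataset(range(32), make_calibration_sample)

# Quantize weights and activations to int8 with default settings;
# image quality may degrade noticeably, as noted in this thread.
quantized_model = nncf.quantize(model, calibration_dataset)
ov.save_model(quantized_model, "unet_int8.xml")
```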

RedAndr commented 1 year ago

Yes, it is half the size of the 16-bit model, but the quality is terrible, I must admit.