Closed · Christianqling2 closed this issue 1 year ago
Hi @Christianqling2,
We recently introduced the INCStableDiffusionPipeline class to enable loading INC-quantized Stable Diffusion models, and the examples were updated accordingly. To try it out, you can install optimum-intel from source. Let us know if you encounter any issues!
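For reference, a minimal sketch of what loading such a model might look like, assuming INCStableDiffusionPipeline is exposed under optimum.intel and follows the usual diffusers from_pretrained / __call__ API (the import path and the model path below are assumptions, not taken from the updated example):

```python
# Install optimum-intel from source first, e.g.:
#   pip install git+https://github.com/huggingface/optimum-intel.git

from optimum.intel import INCStableDiffusionPipeline  # assumed import path

# Placeholder path or Hub id of a Stable Diffusion model quantized with
# Intel Neural Compressor; replace with your own quantized checkpoint.
model_id = "path/to/inc-quantized-stable-diffusion"

# Load the quantized pipeline and run a prompt, assuming the standard
# diffusers-style API.
pipeline = INCStableDiffusionPipeline.from_pretrained(model_id)
image = pipeline("a photo of an astronaut riding a horse").images[0]
image.save("astronaut.png")
```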
Hi, great implementation! By the way, I have a question regarding static PTQ: does it support any granularity other than INT8?
I tested the code from the text-to-image example and found that load_quantized_model replaces the quantized model with the original one. Could you tell me which package versions this example was tested with? Or is load_quantized_model just a fake?
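One rough way to check whether the loaded pipeline actually contains quantized modules, rather than the original FP32 ones, is to inspect the module types in the UNet. This is a generic diagnostic sketch, not part of the official example; the `pipeline` object is assumed to come from a loading call like the one above:

```python
from collections import Counter

def summarize_modules(model):
    # Count the class names of all submodules and flag ones that look
    # quantized (class names containing "Quant" or "Int8"). If every layer
    # is a plain torch.nn.Linear / Conv2d, the quantized weights were
    # likely not loaded.
    counts = Counter(type(m).__name__ for m in model.modules())
    quantized = {name: n for name, n in counts.items() if "Quant" in name or "Int8" in name}
    print("All module types:", dict(counts))
    print("Quantized-looking modules:", quantized or "none found")

summarize_modules(pipeline.unet)
```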