huggingface / optimum-intel

🤗 Optimum Intel: Accelerate inference with Intel optimization tools
https://huggingface.co/docs/optimum/main/en/intel/index
Apache License 2.0

Problem with the text-to-image example #178

Closed Christianqling2 closed 1 year ago

Christianqling2 commented 1 year ago

I tested the code from the text-to-image example and found that `load_quantized_model` replaces the quantized model with the original one. Can you tell me which package versions this example was written for? Or is `load_quantized_model` just a fake loader?
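For context, a minimal sketch of the loading pattern being discussed might look like the following; the checkpoint path and base model are placeholders, and it is only an assumption that the example's `load_quantized_model` helper wraps Neural Compressor's generic `load` utility:

```python
# Sketch of swapping an INC-quantized UNet into a diffusers pipeline (pre-update example style).
# Assumption: the example's load_quantized_model wraps neural_compressor.utils.pytorch.load;
# the checkpoint directory below is a placeholder.
from diffusers import StableDiffusionPipeline
from neural_compressor.utils.pytorch import load

pipeline = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")

# Load the INT8 UNet from the quantization output directory and swap it into the pipeline.
# The report above is that the resulting module still behaved like the original FP32 UNet.
int8_unet = load("path/to/quantized_unet", pipeline.unet)
pipeline.unet = int8_unet
```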

echarlaix commented 1 year ago

Hi @Christianqling2,

We recently introduced the INCStableDiffusionPipeline class to enable loading INC quantized Stable Diffusion models, and the examples were updated accordingly. To try it out, you can install optimum-intel from source. Let us know if you encounter any issues!
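For reference, loading an INC-quantized pipeline with the new class might look roughly like this sketch; the local model path and prompt are placeholders, and the exact import location can vary with the optimum-intel version:

```python
# Minimal sketch, assuming an optimum-intel source install and an INC-quantized
# Stable Diffusion model saved locally; the path and prompt are placeholders.
from optimum.intel import INCStableDiffusionPipeline

pipeline = INCStableDiffusionPipeline.from_pretrained("path/to/inc-quantized-stable-diffusion")
image = pipeline(prompt="a photo of an astronaut riding a horse").images[0]
image.save("generated.png")
```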

JiaojiaoYe1994 commented 1 year ago

Hi, great implementation! By the way, I have a question regarding static PTQ: does it support any precision other than INT8?