ali-vilab / Ranni

https://ranni-t2i.github.io/Ranni/
Apache License 2.0

Quantized Model Usage? #15

Open GalenMarek14 opened 5 months ago

GalenMarek14 commented 5 months ago

Is there a way to use quantized models? The current version is really out of reach for most people with consumer-grade GPUs.

Also, will you train / release models for SD1.5 and SDXL?

Thanks!

thss15fyt commented 5 months ago

Thanks for your interest.

  1. Currently we have no plans to develop a quantized model of Ranni. For the LLM part, however, it should be straightforward to apply community quantization tools to the Llama 2 model used here.
  2. For SD1.5/SDXL, refer to #4