chu-tianxiang / QuIP-for-all

QuIP quantization

Support hqq model on vllm-gptq #8

Closed Minami-su closed 6 months ago

Minami-su commented 7 months ago

Because the vllm-gptq repository does not have issues enabled, I'm raising the issue here.

https://mobiusml.github.io/hqq_blog/

HQQ is a fast and accurate model quantizer that skips the need for calibration data. It's super simple to implement (just a few lines of code for the optimizer). It can crunch through quantizing the Llama2-70B model in only 4 minutes! 🚀
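For reference, the core idea behind HQQ's calibration-free approach is to fit the quantization parameters (notably the zero-point) directly to the weights, minimizing a sparsity-promoting lp-norm (p < 1) reconstruction error with a half-quadratic solver. Below is a minimal sketch of that loop for a uniform n-bit quantizer with per-column scale and zero-point; it is illustrative only and is not the `hqq` library's API, and all names and hyper-parameters here are assumptions:

```python
import numpy as np

def shrink_lp(x, beta, p=0.7):
    # Generalized soft-thresholding: proximal step for the lp norm (p < 1).
    ax = np.abs(x)
    return np.sign(x) * np.maximum(ax - (p / beta) * np.maximum(ax, 1e-8) ** (p - 1), 0.0)

def hqq_optimize_zero(W, s, z, n_bits=4, iters=20, beta=1.0, kappa=1.01, p=0.7):
    """Alternate between a prox step on the quantization error and a
    closed-form zero-point update; no calibration data is needed."""
    lo, hi = 0, 2 ** n_bits - 1
    for _ in range(iters):
        Wq = np.clip(np.round(W / s + z), lo, hi)  # quantize with current z
        Wd = s * (Wq - z)                          # dequantize
        We = shrink_lp(W - Wd, beta, p)            # sparse estimate of the error
        z = np.mean(Wq - (W - We) / s, axis=0)     # closed-form zero-point update
        beta *= kappa                              # anneal the penalty
    return np.clip(np.round(W / s + z), lo, hi), z

# Toy usage: per-column 4-bit quantization of a random weight matrix.
W = np.random.randn(128, 64).astype(np.float32)
s = (W.max(axis=0) - W.min(axis=0)) / (2 ** 4 - 1)  # per-column scale
z = -W.min(axis=0) / s                              # initial zero-point
Wq, z = hqq_optimize_zero(W, s, z)
print("mean reconstruction error:", np.abs(W - s * (Wq - z)).mean())
```

Because the whole optimization runs against the weights themselves rather than activations, there is no calibration dataset to collect, which is what makes quantizing a 70B model so fast.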

I hope to be able to run HQQ models on vllm-gptq.

chu-tianxiang commented 7 months ago

Sorry, I missed the message. I'll look into it later.