turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

Support for AWQ #193

Closed: frankxyy closed this issue 1 month ago

frankxyy commented 7 months ago

Hi, could support for running inference on AWQ-quantized models be added? Thank you!
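
For context, a minimal sketch of how an AWQ checkpoint is typically run today outside exllamav2, assuming the separate AutoAWQ library (`pip install autoawq`) and an illustrative model path (exllamav2's own loaders target EXL2/GPTQ-style weights, which is what this request is about):

```python
# Minimal sketch, not exllamav2 API: running an AWQ-quantized model
# with the separate AutoAWQ library. The model path is hypothetical.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

quant_path = "some-org/some-model-awq"  # hypothetical AWQ checkpoint

# from_quantized loads the quantized AWQ weights onto the GPU;
# fuse_layers enables AutoAWQ's fused kernels for faster decoding.
model = AutoAWQForCausalLM.from_quantized(quant_path, fuse_layers=True)
tokenizer = AutoTokenizer.from_pretrained(quant_path)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```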