turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

Merge experimental #228

Closed · turboderp closed this 7 months ago

turboderp commented 7 months ago

- Mixtral support
- New quant optimization scheme
- Various improvements and bugfixes