qwopqwop200 / GPTQ-for-LLaMa

4-bit quantization of LLaMA using GPTQ
Apache License 2.0

Fixes safetensors compatibility and c4 dataset breakage #290

Open · a5hwinjs opened this pull request 2 months ago