qwopqwop200 / GPTQ-for-LLaMa

4 bits quantization of LLaMA using GPTQ
Apache License 2.0

Update requirements.txt to prevent breaking #267

Closed · nikshepsvn closed this 1 year ago

nikshepsvn commented 1 year ago

transformers requires accelerate >=0.20.3; confirmed the repo still works after the bump.