qwopqwop200 / GPTQ-for-LLaMa

4 bits quantization of LLaMA using GPTQ
Apache License 2.0

Dependency conflicts for `safetensors` #260

Closed Yiximail closed 7 months ago

Yiximail commented 1 year ago

The latest `transformers` requires `safetensors>=0.3.1`:

https://github.com/huggingface/transformers/blob/8f093fb799246f7dd9104ff44728da0c53a9f67a/setup.py#L162

Yiximail commented 1 year ago

I tried changing GPTQ-for-LLaMa's requirements.txt to `safetensors==0.3.1` and it worked, but someone more familiar with the project should confirm whether this pin is safe.
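The proposed pin is compatible with the `transformers` constraint by ordinary version ordering. A minimal sketch (not part of either project, and using a naive dotted-integer parser that is sufficient for these plain `X.Y.Z` versions) of that check:

```python
# Sketch: verify that pinning safetensors==0.3.1 in GPTQ-for-LLaMa's
# requirements.txt satisfies transformers' constraint safetensors>=0.3.1.
# The parse() helper here is illustrative, not a real resolver.
def parse(version: str) -> tuple:
    # "0.3.1" -> (0, 3, 1), so tuples compare component-wise
    return tuple(int(part) for part in version.split("."))

pinned = "0.3.1"            # proposed pin in requirements.txt
required_minimum = "0.3.1"  # minimum from transformers' setup.py

print(parse(pinned) >= parse(required_minimum))  # True -> no conflict
```

A real installer (pip) resolves `>=` specifiers the same way for simple versions like these, which is why the exact-pin `==0.3.1` does not conflict with `>=0.3.1`.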