Closed Yiximail closed 7 months ago
The latest transformers release requires safetensors>=0.3.1:
safetensors>=0.3.1
https://github.com/huggingface/transformers/blob/8f093fb799246f7dd9104ff44728da0c53a9f67a/setup.py#L162
I tried changing GPTQ-for-LLaMa's requirements.txt to safetensors==0.3.1 and it worked, but someone more familiar with the project should confirm whether this change is feasible:
requirements.txt
safetensors==0.3.1
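For anyone unsure why the pin change resolves the conflict: pinning safetensors==0.3.1 still satisfies transformers' >=0.3.1 constraint, because 0.3.1 is within the allowed range. A minimal sketch of the version comparison (the `parse` helper here is illustrative, not part of either project):

```python
def parse(v):
    # Split a dotted version string into a comparable tuple, e.g. "0.3.1" -> (0, 3, 1)
    return tuple(int(x) for x in v.split("."))

required = parse("0.3.1")  # transformers' lower bound: safetensors>=0.3.1

print(parse("0.3.1") >= required)  # the proposed pin satisfies the bound
print(parse("0.3.0") >= required)  # an older pin would conflict
```

So any pin at or above 0.3.1 keeps both packages installable together.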