qwopqwop200/GPTQ-for-LLaMa
4 bits quantization of LLaMA using GPTQ
Apache License 2.0
2.98k stars · 457 forks
I used `python llama.py` to generate a quantized model, but I can't find the .safetensors file
#254 · Closed · jimi202008 closed this 1 year ago
jimi202008 commented 1 year ago:
It was a WSL problem.
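For anyone hitting the same question independently of WSL: as far as I can tell from the repo's README examples, `llama.py` only writes a .safetensors file when a save flag is passed explicitly. This is a hedged sketch, not verified against the current `llama.py` (the flag names, model path, and output filename below are assumptions taken from the README's quantization examples):

```shell
# Assumed invocation (flags from the README, not verified here):
# --save_safetensors <name> writes the quantized weights as .safetensors;
# --save <name> writes a .pt checkpoint instead; with neither flag,
# no model file is written at all.
#
# CUDA_VISIBLE_DEVICES=0 python llama.py ./llama-7b c4 --wbits 4 \
#     --groupsize 128 --save_safetensors llama7b-4bit-128g.safetensors

# Placeholder so this block is runnable as-is; the real command is above.
echo "pass --save_safetensors <name>.safetensors to get a .safetensors file"
```

If you ran quantization without either save flag, the script evaluates the quantized model but leaves nothing on disk, which would also explain a missing .safetensors file.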