IST-DASLab/gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
https://arxiv.org/abs/2210.17323
Apache License 2.0 · 1.81k stars · 145 forks
Issues
#6  bfp full block quantization - algo 3 — isoloveychik, closed 1 year ago, 0 comments
#5  Inference of the Quantised Model (OPT-13B) — anujnayyar1, closed 1 year ago, 1 comment
#4  Why are PPL so low on PTB? — EliottZemour, closed 1 year ago, 1 comment
#3  Application to GPT-J family — khadijakarchaoui, closed 1 year ago, 1 comment
#2  qweight is empty when I gave --save option — junsoo999, closed 1 year ago, 5 comments
#1  Reproduction of the results in the paper — caseus-viridis, closed 1 year ago, 2 comments