IST-DASLab / gptq

Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
https://arxiv.org/abs/2210.17323
Apache License 2.0

Dct 1d v1 #54

Closed. igormolybog closed this pull request 4 months ago

igormolybog commented 4 months ago

Experimental change that compresses each column via a 1-D DCT inside the algorithm, instead of applying the naive quantization.
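
For context, here is a minimal sketch of what "compressing a column with a 1-D DCT instead of quantizing it" could look like; this is not the code from this PR. The function name `dct_compress_column`, the `keep_ratio` knob, and the keep-top-k-coefficients scheme are all assumptions made for illustration.

```python
# Hypothetical sketch: lossy 1-D DCT compression of a weight column,
# standing in for the per-column quantization step. Not the PR's actual code.
import numpy as np
from scipy.fft import dct, idct


def dct_compress_column(col: np.ndarray, keep_ratio: float = 0.25) -> np.ndarray:
    """Compress a 1-D column by keeping only its largest-magnitude DCT coefficients.

    keep_ratio (hypothetical parameter): fraction of coefficients retained.
    Returns the reconstructed (lossy) column, analogous to the dequantized
    value that the algorithm would feed back into its error-correction update.
    """
    coeffs = dct(col, norm="ortho")            # forward 1-D DCT (type II)
    k = max(1, int(len(coeffs) * keep_ratio))  # number of coefficients to keep
    drop = np.argsort(np.abs(coeffs))[:-k]     # indices of the smallest coefficients
    coeffs[drop] = 0.0                         # zero everything except the top k
    return idct(coeffs, norm="ortho")          # reconstruct the column


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.standard_normal(128).astype(np.float32)
    w_hat = dct_compress_column(w, keep_ratio=0.25)
    print("relative reconstruction error:", np.linalg.norm(w - w_hat) / np.linalg.norm(w))
```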