qwopqwop200 / GPTQ-for-LLaMa

4-bit quantization of LLaMA using GPTQ
Apache License 2.0

Could not obtain official perplexity using bloom_eval() #272

Status: Open — opened by xingyueye 1 year ago

xingyueye commented 1 year ago

Hi, I ran bloom.py with fp16 to measure the perplexity (PPL) of BLOOM on the Wikitext-2, PTB, and C4 datasets. My results are 11.79 / 20.14 / 17.68, which are worse than the official results of 11.37 / 19.40 / 14.13.
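For context on what is being compared: these evaluation scripts report corpus-level perplexity, i.e. the exponential of the mean per-token negative log-likelihood over the dataset. A minimal sketch of that final reduction step (the function name and the chunked-NLL interface are illustrative, not taken from bloom.py):

```python
import math

def perplexity_from_nll(nll_sums, token_counts):
    """Corpus perplexity = exp(total NLL / total predicted tokens).

    nll_sums: per-chunk sums of token negative log-likelihoods (in nats)
    token_counts: number of predicted tokens in each chunk
    """
    total_nll = sum(nll_sums)
    total_tokens = sum(token_counts)
    return math.exp(total_nll / total_tokens)

# Sanity check: a uniform model over a vocabulary of size V assigns
# NLL = ln(V) to every token, so its perplexity is exactly V.
V = 50272
uniform_ppl = perplexity_from_nll([math.log(V) * 100], [100])
```

Because the mean is taken over tokens, small differences in tokenization, sequence chunking (stride vs. non-overlapping windows), or which tokens are masked out can shift the reported PPL by a few tenths, which is one common source of gaps like these.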