xvyaward/owq
Code for the AAAI 2024 Oral paper "OWQ: Outlier-Aware Weight Quantization for Efficient Fine-Tuning and Inference of Large Language Models".
https://arxiv.org/abs/2306.02272
53 stars · 5 forks
Issues
#5 · llama 3 8B ppl · by dorsa-zeinali · opened 2 months ago · 0 comments
#4 · model_config.json · by dorsa-zeinali · opened 2 months ago · 1 comment
#3 · Llama 2 perplexity results on wikitext2? · by seannz · closed 2 months ago · 6 comments
#2 · NaN ppl when running on Llama2-7b without owq (--wbits 4 --target_bit 4) · by hopef · closed 7 months ago · 0 comments
#1 · Failed to reproduce the LLama7B perplexity on the Penn Treebank (PTB) dataset · by hopef · closed 8 months ago · 2 comments