Zhen-Dong / HAWQ

Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
MIT License

W4A4 precision #24

Open leiwen83 opened 2 years ago

leiwen83 commented 2 years ago

Hi,

I tried quantizing resnet50 in W8A8 mode and achieved a good result of 77%, but when I switch to W4A4 using the command below, its accuracy drops to: Acc@1 34.898 Acc@5 56.298

python quant_train.py -a resnet50 --epochs 1 --lr 0.0001 --batch-size 128 --data /mnt/imagenet/imagenet/ --pretrained --save-path out/ --act-range-momentum=0.99 --wd 1e-4 --data-percentage 0.0001 --fix-BN --checkpoint-iter -1 --quant-scheme uniform4

So do I need to change parameters such as the learning rate to reach the reported accuracy?
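One thing worth noting about the command: `--data-percentage 0.0001` leaves very little data for fine-tuning. A rough back-of-the-envelope check (assuming the standard ImageNet-1k training set of ~1,281,167 images; the variable names below are illustrative):

```python
# Estimate how much fine-tuning data --data-percentage 0.0001 actually leaves,
# assuming the standard ImageNet-1k training set (~1,281,167 images).
imagenet_train_size = 1_281_167
data_percentage = 0.0001
batch_size = 128

images_used = int(imagenet_train_size * data_percentage)  # ~128 images
batches_per_epoch = max(1, images_used // batch_size)     # ~1 batch per epoch
print(images_used, batches_per_epoch)
```

With `--epochs 1`, that amounts to roughly a single optimizer step of quantization-aware fine-tuning, which is unlikely to recover W4A4 accuracy.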

ccw-li commented 2 years ago

I have a similar problem.

When I try to quantize ResNet18 in W4A4 mode, the accuracy is very low: Acc@1 8.634 Acc@5 19.218

Am I running the quantization command correctly?

Thanks in advance.

My command:
export CUDA_VISIBLE_DEVICES=0
python quant_train.py -a resnet18 --epochs 1 --lr 0.0001 --batch-size 128 --data ~/datasets/imagenet/jpegs --pretrained --save-path ./checkpoints/ --act-range-momentum=0.99 --wd 1e-4 --data-percentage 0.0001 --fix-BN --checkpoint-iter -1 --quant-scheme uniform4
My environment: CentOS 7 + RTX8000 GPU

jiangaojie commented 2 years ago

I meet the same problem! Have you been able to solve it? Thanks

ngrxmu commented 2 years ago

Try replacing some parameters in the command as follows: --epochs 90 --data-percentage 1. This runs the complete fine-tuning on the full training set.
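For readers wondering why W4A4 is so much harder than W8A8: a signed 4-bit value has only 16 levels, so quantization error is large and real fine-tuning is needed to recover accuracy. Below is a minimal sketch of symmetric uniform quantization in the spirit of a "uniform4" scheme; the function name and details are illustrative assumptions, not HAWQ's actual implementation.

```python
# Illustrative sketch of symmetric uniform quantization (NOT HAWQ's code).
# Maps floats to signed num_bits integers and dequantizes them back,
# showing the values a fake-quantized model effectively "sees".

def uniform_quantize(x, num_bits=4):
    qmax = 2 ** (num_bits - 1) - 1            # 7 for signed 4-bit
    max_abs = max(abs(min(x)), abs(max(x)))   # symmetric clipping range
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = [max(-qmax - 1, min(qmax, round(v / scale))) for v in x]
    deq = [v * scale for v in q]
    return deq, q

deq, q = uniform_quantize([-1.0, -0.5, 0.0, 0.5, 1.0], num_bits=4)
print(q)    # integer codes in [-8, 7]
print(deq)  # reconstructed values
```

With 8 bits (256 levels) the rounding error is small enough that a short calibration often suffices, which is consistent with W8A8 working well here while W4A4 needs the full 90-epoch run.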