ziplab / QTool

A collection of model quantization algorithms. For any issues, please contact Peng Chen (blueardour@gmail.com).

DAIA #2

Closed · qiulinzhang closed 3 years ago

qiulinzhang commented 3 years ago

Thanks for your great paper on SR quantization. I have one question about the method:

Does DAIA differ from LSQ in any way other than the initial warm-up used to initialize the step size?

Or did you specialize LSQ for the SR task, and that is how you arrived at Distribution-Aware Interval Adaptation?
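For reference, the warm-up I mean is LSQ's data-driven step-size initialization. A minimal sketch of the formula from the LSQ paper (Esser et al., 2020); `init_step_size` is an illustrative name, not code from this repo:

```python
import torch

def init_step_size(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    # LSQ initialization: s = 2 * mean(|x|) / sqrt(Q_P),
    # with Q_P = 2**(b-1) - 1 for signed tensors.
    q_p = 2 ** (num_bits - 1) - 1
    return 2.0 * x.abs().mean() / (q_p ** 0.5)
```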

qiulinzhang commented 3 years ago

Another question: are there any experimental results comparing with LSQ or EWGS?

blueardour commented 3 years ago

Hi, we followed LSQ as the quantizer in the SR task. The main contributions, in my opinion, are:

  1. Quantizing all layers: not only the non-linear mapping, but also the feature extraction and reconstruction modules.
  2. The self-supervised loss function, which shows a clear benefit on the SR task.

For the comparison with LSQ, please refer to Table 5. FQSR improves over LSQ by about 1.3 dB in PSNR.
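For readers unfamiliar with the quantizer: below is a minimal sketch of an LSQ-style fake quantizer as it is commonly implemented in PyTorch, including the gradient scaling recommended by the LSQ paper. The function names are illustrative and not QTool's actual API.

```python
import torch

def grad_scale(x, scale):
    # Forward: returns x unchanged. Backward: gradient of x is multiplied by `scale`.
    return (x - x * scale).detach() + x * scale

def round_pass(x):
    # Forward: rounds x. Backward: straight-through estimator (gradient of 1).
    return (x.round() - x).detach() + x

def lsq_quantize(v: torch.Tensor, s: torch.Tensor, num_bits: int = 4) -> torch.Tensor:
    # Fake-quantize a signed tensor v with learnable step size s (LSQ, Esser et al., 2020).
    q_n = -(2 ** (num_bits - 1))        # e.g. -8 at 4 bits
    q_p = 2 ** (num_bits - 1) - 1       # e.g. +7 at 4 bits
    g = 1.0 / (v.numel() * q_p) ** 0.5  # gradient scale from the LSQ paper
    s = grad_scale(s, g)
    v_bar = round_pass((v / s).clamp(q_n, q_p))
    return v_bar * s                    # dequantized output used downstream
```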

blueardour commented 3 years ago

Feel free to re-open if you have further questions.